Subject: Re: Possible to run an application jar as a hadoop daemon?
From: John Hancock <jhancock1975@gmail.com>
To: user@hadoop.apache.org
Date: Sun, 6 Jan 2013 06:14:05 -0500

Krishna,

You should be able to take the command you are using to start the hadoop job (hadoop jar ...) and paste it into a text file. Then make the file executable and call it as a shell script from a cron job (crontab -e). To be safe, use absolute paths to reference any files in the command (a rough sketch is below the quoted thread).

Or, I suppose what you crazy kids and your object-oriented programming would do is use Quartz.

-John

On Sat, Jan 5, 2013 at 4:33 PM, Chitresh Deshpande <chitreshdeshpande@gmail.com> wrote:

> Hi Krishna,
>
> I don't know what you mean by a Hadoop daemon, but if you mean run it when
> all the other hadoop daemons like the namenode, datanode, etc. are started,
> then you can change the start-all file in the conf directory.
>
> Thanks and Regards,
> Chitresh Deshpande
>
>
> On Fri, Jan 4, 2013 at 6:40 AM, Krishna Rao <krishnanjrao@gmail.com> wrote:
>
>> Hi all,
>>
>> I have a java application jar that converts some files and writes
>> directly into HDFS.
>>
>> If I want to run the jar I need to run it using "hadoop jar <application
>> jar>", so that it can access HDFS (that is, running "java -jar
>> <application jar>" results in an HDFS error).
>>
>> Is it possible to run a jar as a hadoop daemon?
>>
>> Cheers,
>>
>> Krishna
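Here is the sketch I mentioned above. The script name, paths, and schedule are only examples (I am assuming a typical tarball install under /usr/local/hadoop), so adjust them for your setup:

    #!/bin/sh
    # run-converter.sh - example wrapper around the "hadoop jar" command.
    # cron runs with a minimal environment, so use absolute paths throughout.
    export JAVA_HOME=/usr/lib/jvm/java-6-sun    # adjust to your JVM
    HADOOP=/usr/local/hadoop/bin/hadoop         # adjust to your Hadoop install
    APP_JAR=/home/krishna/converter.jar         # your application jar

    # Run the job and append output to a log so failures under cron are visible.
    $HADOOP jar "$APP_JAR" >> /var/log/converter.log 2>&1

Make the script executable (chmod +x run-converter.sh) and add a line like this with crontab -e, e.g. to run it every night at 2am:

    0 2 * * * /home/krishna/run-converter.sh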