From: Harish Mallipeddi
Date: Wed, 29 Jul 2009 22:50:02 +0530
Subject: Re: userlogs
To: common-user@hadoop.apache.org

On Wed, Jul 29, 2009 at 10:06 PM, Marc Limotte wrote:

> A quick question about the user logs (Hadoop 0.19.1).
>
> I've looked all over HDFS and can't locate the actual userlogs. Where are
> they stored? I can get to them through the JobTracker web interface, but
> I'd like to be able to script some jobs to check the logs for errors. I
> suppose I could use curl against the web interface, but that seems like a
> hack. Alternatively, is there command-line access to the userlogs through
> the hadoop command? I couldn't find anything like that.
>
> Thanks for any tips...

All Hadoop logs are stored on the local filesystem of each node (the
location is configurable); they are not in HDFS, which is why you can't
find them there. What you could do, though, is run a cron job on each
node that periodically pushes the log files into HDFS, say at the end
of the day.
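For example, here's a minimal sketch of such a cron job. The paths
(HADOOP_HOME, the local userlogs directory, and the HDFS destination)
are assumptions -- adjust them to match your cluster's configuration:

  #!/bin/sh
  # Push this node's task logs into HDFS under a per-node, per-day
  # directory. All paths below are assumptions for illustration.
  HADOOP_HOME=/usr/local/hadoop
  LOCAL_LOGS=$HADOOP_HOME/logs/userlogs    # default userlogs location
  DEST=/archived-logs/`hostname`/`date +%Y-%m-%d`

  # Create the destination directory in HDFS, then copy the whole
  # userlogs tree into it.
  $HADOOP_HOME/bin/hadoop fs -mkdir $DEST
  $HADOOP_HOME/bin/hadoop fs -put $LOCAL_LOGS $DEST

Once the logs are in HDFS you can script against them with the hadoop
command instead of scraping the web UI, e.g. something like:

  hadoop fs -cat /archived-logs/node1/2009-07-29/userlogs/*/stderr | grep -i error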
Here's another idea:
http://www.cloudera.com/blog/2008/11/02/configuring-and-using-scribe-for-hadoop-log-collection/

--
Harish Mallipeddi
http://blog.poundbang.in