From: Mahesh Balija <balijamahesh.mca@gmail.com>
To: user@hadoop.apache.org
Date: Thu, 17 Jan 2013 16:09:56 +0530
Subject: Re: How to copy log files from remote windows machine to Hadoop cluster

I have studied Flume but I didn't find anything useful for my case.
My requirement is: there is a directory on a Windows machine in which log files are generated and continuously updated with new entries. I want a tail-like mechanism (using the exec source) through which I can push the latest updates into the cluster. Alternatively, I could simply push to the cluster once a day using the spooling-directory mechanism.

Can somebody advise whether this is possible using Flume, and if so, which configurations are needed, specifically for a remote Windows machine?

But

On Thu, Jan 17, 2013 at 3:48 PM, Mirko Kämpf <mirko.kaempf@gmail.com> wrote:
> Give Flume (http://flume.apache.org/) a chance to collect your data.
>
> Mirko
>
>
> 2013/1/17 sirenfei <sirenxue@gmail.com>
>>
>> ftp auto upload?
>>
>>
>> 2013/1/17 Mahesh Balija <balijamahesh.mca@gmail.com>:
>> > the Hadoop cluster (HDFS) either in synchronous or asynchronous
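For the once-a-day scenario described above, a minimal sketch of a Flume 1.x agent configuration using the spooling-directory source and an HDFS sink might look like the following. The agent name, directory paths, and NameNode address are hypothetical; note that the spooldir source reads the local filesystem, so the agent (or at least a first-hop agent) must run on the machine where the log files land:

```properties
# Hypothetical agent "agent1": spooldir source -> memory channel -> HDFS sink
agent1.sources = spool-src
agent1.channels = mem-ch
agent1.sinks = hdfs-sink

# Spooling-directory source: files dropped here are ingested, then
# renamed with a .COMPLETED suffix. Files must not be modified after
# being placed in the spool directory.
agent1.sources.spool-src.type = spooldir
agent1.sources.spool-src.spoolDir = /var/log/incoming
agent1.sources.spool-src.channels = mem-ch

# In-memory channel buffering events between source and sink
agent1.channels.mem-ch.type = memory
agent1.channels.mem-ch.capacity = 10000

# HDFS sink writing plain text files, bucketed by date
agent1.sinks.hdfs-sink.type = hdfs
agent1.sinks.hdfs-sink.hdfs.path = hdfs://namenode:8020/flume/logs/%Y-%m-%d
agent1.sinks.hdfs-sink.hdfs.fileType = DataStream
agent1.sinks.hdfs-sink.hdfs.useLocalTimeStamp = true
agent1.sinks.hdfs-sink.channel = mem-ch
```

The agent would then be started with something like `flume-ng agent -n agent1 -f flume.conf`. The spooldir source requires files to be immutable once dropped, which matches the daily-batch case; for the continuous tail-like case, the exec source (e.g. a `tail -F` command) is the usual approach on Unix, but on Windows an equivalent tailing command or a staging step to a Linux host would be needed.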