From: Mohammad Tariq <dontariq@gmail.com>
Date: Fri, 18 Jan 2013 09:40:49 +0530
Subject: Re: Program trying to read from local instead of hdfs
To: user@hadoop.apache.org

Not an issue :)

Warm Regards,
Tariq
https://mtariq.jux.com/
cloudfront.blogspot.com


On Fri, Jan 18, 2013 at 9:38 AM, jamal sasha wrote:

> Hi,
> Just saw your email. I was so tired with this issue that the moment it
> ran, I took some time off. I will get back to you soon :)
> thanks
>
>
> On Thu, Jan 17, 2013 at 5:04 PM, Mohammad Tariq wrote:
>
>> Which Hadoop version are you using? That post is quite old.
>> Try the same thing using the new API. Also, modify the
>> above 2 lines to:
>>
>> conf.addResource(new
>> File("/hadoop/projects/hadoop-1.0.4/conf/core-site.xml").getAbsoluteFile().toURI().toURL());
>> conf.addResource(new
>> File("/hadoop/projects/hadoop-1.0.4/conf/hdfs-site.xml").getAbsoluteFile().toURI().toURL());
>>
>> Sometimes if the resource is added as a String and not as a URL,
>> it is interpreted as a classpath resource, which I think is not getting
>> resolved here.
>>
>> Let me know if this works, just to confirm whether I am correct
>> or not.
>>
>> Warm Regards,
>> Tariq
>> https://mtariq.jux.com/
>> cloudfront.blogspot.com
>>
>>
>> On Fri, Jan 18, 2013 at 5:48 AM, jamal sasha wrote:
>>
>>> On Thu, Jan 17, 2013 at 4:14 PM, Mohammad Tariq wrote:
>>>
>>>> hdfs://your_namenode:9000/user/hduser/data/input1.txt
>>>>
>>> It runs :D
>>> But I am very curious. If I run the sample wordcount example normally,
>>> it automatically reads from the hdfs location,
>>> but here it didn't seem to respect that?
>>>
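
For readers of the archive, here is a minimal, self-contained sketch of the fix described in the quoted reply, assuming a Hadoop 1.x install with its conf directory at the path from the thread; the class name, the sample input path, and the existence check are illustrative only:

import java.io.File;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsReadCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // Add the cluster config files as URLs, not Strings. A String argument is
        // treated as a classpath resource, which is why the job fell back to the
        // local filesystem when the files were not on the classpath.
        conf.addResource(new File("/hadoop/projects/hadoop-1.0.4/conf/core-site.xml")
                .getAbsoluteFile().toURI().toURL());
        conf.addResource(new File("/hadoop/projects/hadoop-1.0.4/conf/hdfs-site.xml")
                .getAbsoluteFile().toURI().toURL());

        // With fs.default.name now picked up from core-site.xml, FileSystem.get(conf)
        // returns the distributed filesystem rather than the local one, so a plain
        // path resolves against HDFS.
        FileSystem fs = FileSystem.get(conf);
        Path input = new Path("/user/hduser/data/input1.txt");
        System.out.println(fs.getUri() + " -> exists? " + fs.exists(input));
    }
}

Alternatively, as in the quoted reply, a fully qualified path such as hdfs://your_namenode:9000/user/hduser/data/input1.txt bypasses the default-filesystem lookup entirely, which is why that variant ran even before the config files were loaded.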