From: Hardik Pandya <smarty.juice@gmail.com>
Date: Fri, 2 May 2014 10:44:32 -0400
Subject: Re: Wordcount file cannot be located
To: user@hadoop.apache.org

Please add the line below to your config - for some reason the hadoop-common jar is being overwritten, so the HDFS FileSystem implementation needs to be registered explicitly - please share your feedback - thanks

config.set("fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());

(A fuller configuration sketch is included below the quoted thread.)

On Fri, May 2, 2014 at 12:08 AM, Alex Lee <eliyart@hotmail.com> wrote:

> I tried to add the code, but it seems it is still not working.
> http://postimg.org/image/6c1dat3jx/
>
> 2014-05-02 11:56:06,780 WARN  [main] util.NativeCodeLoader
> (NativeCodeLoader.java:<clinit>(62)) - Unable to load native-hadoop library
> for your platform... using builtin-java classes where applicable
> java.io.IOException: No FileSystem for scheme: hdfs
>
> Also, the Eclipse DFS location can reach /tmp/ but cannot enter /user/.
>
> Any suggestions, thanks.
>
> alex
>
> ------------------------------
> From: unmeshabiju@gmail.com
> Date: Fri, 2 May 2014 08:43:26 +0530
> Subject: Re: Wordcount file cannot be located
> To: user@hadoop.apache.org
>
> Try this along with your MapReduce source code:
>
> Configuration config = new Configuration();
> config.set("fs.defaultFS", "hdfs://IP:port/");
> FileSystem dfs = FileSystem.get(config);
> Path path = new Path("/tmp/in");
>
> Let me know your thoughts.
>
> --
> Thanks & Regards
>
> Unmesha Sreeveni U.B
> Hadoop, Bigdata Developer
> Center for Cyber Security | Amrita Vishwa Vidyapeetham
> http://www.unmeshasreeveni.blogspot.in/
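For reference, here is a minimal, self-contained sketch combining both suggestions. It is only a sketch under assumptions: the NameNode address (namenode-host:8020) and the class name HdfsAccessCheck are placeholders, and the /tmp/in path is taken from the snippet quoted above - adjust them to your cluster. The "No FileSystem for scheme: hdfs" error is commonly caused by the META-INF/services/org.apache.hadoop.fs.FileSystem registrations from hadoop-common and hadoop-hdfs overwriting each other when a single job jar is assembled; setting fs.hdfs.impl (and fs.file.impl) explicitly works around that.

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.LocalFileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.hdfs.DistributedFileSystem;

    public class HdfsAccessCheck {
        public static void main(String[] args) throws IOException {
            Configuration config = new Configuration();
            // Point the client at the cluster; replace host/port with your NameNode address.
            config.set("fs.defaultFS", "hdfs://namenode-host:8020/");
            // Re-register the FileSystem implementations explicitly, in case the
            // META-INF/services entries were lost when the job jar was assembled.
            config.set("fs.hdfs.impl", DistributedFileSystem.class.getName());
            config.set("fs.file.impl", LocalFileSystem.class.getName());

            FileSystem dfs = FileSystem.get(config);
            Path in = new Path("/tmp/in");
            System.out.println("Connected to: " + dfs.getUri());
            System.out.println(in + " exists: " + dfs.exists(in));
        }
    }

If the problem only shows up when running from a merged ("fat") jar, another option worth checking is having the build merge the META-INF/services files instead of overwriting them (for example, maven-shade-plugin's ServicesResourceTransformer); the explicit config.set calls above simply avoid depending on the build setup.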