Subject: Re: setting hadoop for pseudo distributed mode.
From: Mohammad Tariq <dontariq@gmail.com>
Date: Fri, 28 Dec 2012 04:40:40 +0530
To: user@hadoop.apache.org

What are those libraries, and how are they reading data from HDFS? You were trying with MR jobs, if I'm not wrong? In order to perform reads/writes on HDFS, we need the HDFS API with a Configuration object. How are you doing it here?

Best Regards,
Tariq
+91-9741563634
https://mtariq.jux.com/


On Fri, Dec 28, 2012 at 2:38 AM, jamal sasha wrote:

> Hi,
>   Thanks for throwing insight.
>
> So the code snippet looks like this:
>
>     String interout = final_output + "/intermediate";
>
>     try {
>       new CreateInterOutput().main(new String[] { input, interout });
>     } catch (Exception e) {
>       e.printStackTrace();
>       return;
>     }
>
>     try {
>       new CreateFinalOutput().main(new String[] { interout, final_output });
>     } catch (Exception e) {
>       e.printStackTrace();
>       return;
>     }
>
> Any suggestions where it might be faltering?
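One likely culprit in the snippet above: calling another driver's main() usually constructs a fresh Configuration inside that class, so any resources added in the outer driver never reach the second job, and its paths get resolved against the wrong filesystem. A minimal sketch of chaining both stages through one shared Configuration instead — assuming CreateInterOutput and CreateFinalOutput implement org.apache.hadoop.util.Tool, which the snippet doesn't show:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.util.ToolRunner;

    public class ChainedDriver {
        public static void main(String[] args) throws Exception {
            String input = args[0];
            String final_output = args[1];
            String interout = final_output + "/intermediate";

            // Load the cluster's site files once, into the single
            // Configuration that both stages will share.
            Configuration conf = new Configuration();
            conf.addResource(new Path("/usr/local/hadoop/conf/core-site.xml"));
            conf.addResource(new Path("/usr/local/hadoop/conf/hdfs-site.xml"));

            // ToolRunner hands the same conf to each Tool, so both jobs
            // resolve their paths against HDFS rather than the local FS.
            // (Assumes both drivers implement org.apache.hadoop.util.Tool.)
            int rc = ToolRunner.run(conf, new CreateInterOutput(),
                                    new String[] { input, interout });
            if (rc == 0) {
                ToolRunner.run(conf, new CreateFinalOutput(),
                               new String[] { interout, final_output });
            }
        }
    }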
> On Thu, Dec 27, 2012 at 12:49 PM, Mohammad Tariq wrote:
>
>> Hello Jamal,
>>
>> Please find my comments embedded below:
>>
>> Q1) How did putting those two lines solve the issue?
>>
>> By adding those two resources you make sure that your code looks for
>> the input path inside HDFS; it would otherwise look for it in the
>> local FS by default. The files core-site.xml and hdfs-site.xml tell
>> your code where to reach the NameNode (NN) and the DataNodes (DN).
>>
>> Q2) I am now using third-party libraries which take input from HDFS
>> and write output to HDFS, but in an intermediate step it creates a
>> raw output, and I am again getting the error:
>>
>>   ERROR security.UserGroupInformation: PriviledgedActionException
>>   as:mhduser cause:org.apache.hadoop.mapred.InvalidInputException:
>>   Input path does not exist: hdfs://localhost:54310/user/hduser/wiki-inter-output
>>
>> How do I resolve this?
>>
>> If your code compiles properly, then there is no problem with the
>> third-party libraries you are using. It looks to me like your code
>> doesn't have the proper info about the intermediate path. Please make
>> sure you have told your code the exact location of the intermediate
>> output.
>>
>> Best Regards,
>> Tariq
>> +91-9741563634
>> https://mtariq.jux.com/
>>
>> On Fri, Dec 28, 2012 at 1:33 AM, jamal sasha wrote:
>>
>>> Hi,
>>> So I am still in the process of learning Hadoop.
>>> I tried to run wordcount.java (writing my own mapper and reducer,
>>> creating a jar, and then running it in pseudo-distributed mode).
>>>
>>> At that time I got an error, something like:
>>>
>>>   ERROR security.UserGroupInformation: PriviledgedActionException
>>>   as:mhduser cause:org.apache.hadoop.mapred.InvalidInputException:
>>>   Input path does not exist: hdfs://localhost:54310/user/hduser/wiki
>>>
>>> So I googled around and found that I should put the following two
>>> lines in my driver code:
>>>
>>>     conf.addResource(new Path("/usr/local/hadoop/conf/core-site.xml"));
>>>     conf.addResource(new Path("/usr/local/hadoop/conf/hdfs-site.xml"));
>>>
>>> (the paths where my core-site and hdfs-site live), and after that it
>>> ran just fine.
>>>
>>> Q1) How did putting those two lines solve the issue?
>>> Q2) I am now using third-party libraries which take input from HDFS
>>> and write output to HDFS, but in an intermediate step it creates a
>>> raw output, and I am again getting the error:
>>>
>>>   ERROR security.UserGroupInformation: PriviledgedActionException
>>>   as:mhduser cause:org.apache.hadoop.mapred.InvalidInputException:
>>>   Input path does not exist: hdfs://localhost:54310/user/hduser/wiki-inter-output
>>>
>>> How do I resolve this?
>>>
>>> Any suggestions?
>>> Thanks,
>>> Jamal.
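As a quick sanity check on the explanation above, a minimal sketch using the plain FileSystem API (the intermediate path is copied from the error message; the conf file locations are the ones quoted in the thread): it prints which filesystem a Configuration actually resolves to, and whether the intermediate output really exists on HDFS.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsSanityCheck {
        public static void main(String[] args) throws Exception {
            // A bare Configuration with no site files on the classpath
            // falls back to the local filesystem (file:///) -- which is
            // why the input path was not found before the two
            // addResource() lines were added to the driver.
            Configuration bare = new Configuration();
            System.out.println("bare conf       -> " + FileSystem.get(bare).getUri());

            Configuration conf = new Configuration();
            conf.addResource(new Path("/usr/local/hadoop/conf/core-site.xml"));
            conf.addResource(new Path("/usr/local/hadoop/conf/hdfs-site.xml"));
            FileSystem fs = FileSystem.get(conf);
            System.out.println("with site files -> " + fs.getUri());
            // Expected here: hdfs://localhost:54310

            // Path copied from the InvalidInputException above.
            Path inter = new Path("/user/hduser/wiki-inter-output");
            System.out.println(inter + " exists? " + fs.exists(inter));
        }
    }

If the second line doesn't print hdfs://localhost:54310, the code is still running against the default (local) configuration, and the intermediate path will never be found on HDFS no matter where the libraries write it.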