Subject: Re: Integrating hadoop with java UI application deployed on tomcat
From: Visioner Sadak <visioner.sadak@gmail.com>
To: user@hadoop.apache.org
Date: Mon, 3 Sep 2012 18:02:57 +0530

Thanks, Hemanth. I tried adding the conf folder under the ext folder and under the
extension root folder (I was unable to add just the XML file on its own), but the
problem is still the same. Thanks for the help.

On Mon, Sep 3, 2012 at 4:11 PM, Hemanth Yamijala <yhemanth@gmail.com> wrote:
> Hi,
>
> If you are getting the LocalFileSystem, you could try putting
> core-site.xml in a directory that is on the classpath for the
> Tomcat app (or include such a path in the classpath, if that's
> possible).
>
> Thanks
> hemanth
>
> On Mon, Sep 3, 2012 at 4:01 PM, Visioner Sadak <visioner.sadak@gmail.com> wrote:
> > Thanks, Steve. There is nothing in the logs and no exceptions either. I found
> > that a file is created under my F:\user directory with the target directory
> > name, but it is not visible when I browse the filesystem directories in
> > Hadoop. I also added the config using the method below:
> >
> >     hadoopConf.addResource("F:/hadoop-0.22.0/conf/core-site.xml");
> >
> > When running through the WAR and printing out the filesystem, I get
> > org.apache.hadoop.fs.LocalFileSystem@9cd8db.
> > When running an independent jar within Hadoop, I get
> > DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]], and with the
> > independent jar I am able to do uploads.
> >
> > I just wanted to know whether I have to add something to Tomcat's classpath,
> > or whether there is some other core-site.xml configuration that I am missing.
> > Thanks for your help.
> >
> > On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <stevel@hortonworks.com>
> > wrote:
> >>
> >> Well, it's worked for me in the past outside Hadoop itself:
> >>
> >> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
> >>
> >> Turn logging up to DEBUG.
> >> Make sure that the filesystem you've just loaded is what you expect, by
> >> logging its value. It may turn out to be file:///, because the normal Hadoop
> >> site-config.xml isn't being picked up.
> >>
> >>> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak <visioner.sadak@gmail.com>
> >>> wrote:
> >>>>
> >>>> But the problem is that my code executes with the warning, yet the file
> >>>> is not copied to HDFS. I am actually trying to copy a file from the local
> >>>> filesystem to HDFS:
> >>>>
> >>>>     Configuration hadoopConf = new Configuration();
> >>>>     // get the default associated file system
> >>>>     FileSystem fileSystem = FileSystem.get(hadoopConf);
> >>>>     // HarFileSystem harFileSystem = new HarFileSystem(fileSystem);
> >>>>     // copy from the local filesystem to HDFS
> >>>>     fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"), new
> >>>>         Path("/user/TestDir/"));
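For reference, a minimal, self-contained sketch of the approach discussed in this
thread, assuming the Hadoop client jars are available and core-site.xml points the
default filesystem (fs.default.name in 0.22-era configs) at the HDFS namenode; the
class name, the Windows paths, and the HDFS target directory are illustrative only.
One detail worth noting: Configuration.addResource(String) resolves its argument
against the classpath, so loading an explicit file such as
F:/hadoop-0.22.0/conf/core-site.xml needs the addResource(Path) overload, or else
the directory containing core-site.xml should be on the Tomcat webapp's classpath
(e.g. WEB-INF/classes), as Hemanth suggests above.

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsUploadSketch {
        public static void main(String[] args) throws IOException {
            Configuration hadoopConf = new Configuration();

            // Load the cluster configuration explicitly. addResource(Path) reads a
            // file path; addResource(String) would be resolved against the classpath
            // instead. The path below is illustrative only.
            hadoopConf.addResource(new Path("F:/hadoop-0.22.0/conf/core-site.xml"));

            FileSystem fileSystem = FileSystem.get(hadoopConf);

            // If core-site.xml was not picked up, this prints LocalFileSystem and a
            // file:/// URI instead of a DistributedFileSystem and an hdfs:// URI.
            System.out.println("Filesystem class: " + fileSystem.getClass().getName());
            System.out.println("Filesystem URI:   " + fileSystem.getUri());

            // Copy a local file into HDFS; both paths are illustrative only.
            fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),
                    new Path("/user/TestDir/"));
        }
    }

If the printed class is still LocalFileSystem and the URI is file:///, the
configuration was not picked up, which matches the symptom described above when the
code runs inside the WAR on Tomcat.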