hadoop-user mailing list archives

From Visioner Sadak <visioner.sa...@gmail.com>
Subject Re: Integrating hadoop with java UI application deployed on tomcat
Date Wed, 05 Sep 2012 13:42:28 GMT
Any solution guys? Badly stuck on this.

On Tue, Sep 4, 2012 at 4:28 PM, Visioner Sadak <visioner.sadak@gmail.com> wrote:

> Thanks Bejoy, actually my hadoop is also on Windows (I have installed it in
> pseudo-distributed mode for testing); it's not a remote cluster....
>
>
> On Tue, Sep 4, 2012 at 3:38 PM, Bejoy KS <bejoy.hadoop@gmail.com> wrote:
>
>> Hi
>>
>> You are running Tomcat on a Windows machine and trying to connect to a
>> remote hadoop cluster from there. Your core-site.xml has
>>
>> <name>fs.default.name</name>
>> <value>hdfs://localhost:9000</value>
>>
>> but it is localhost here. (I assume you are not running hadoop on this
>> Windows environment for some testing.)
>>
>> You need to have the exact configuration files and hadoop jars from the
>> cluster machines on this Tomcat environment as well, I mean on the
>> classpath of your application.
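For example, a core-site.xml shipped to the Tomcat machine would point at the cluster's real namenode rather than localhost. A minimal sketch (the hostname and port below are placeholders; copy the actual file from a cluster node rather than writing it by hand):

```xml
<?xml version="1.0"?>
<configuration>
  <property>
    <name>fs.default.name</name>
    <!-- placeholder: substitute your cluster's namenode host:port -->
    <value>hdfs://namenode.example.com:9000</value>
  </property>
</configuration>
```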
>> Regards
>> Bejoy KS
>>
>> Sent from handheld, please excuse typos.
>> ------------------------------
>> From: Visioner Sadak <visioner.sadak@gmail.com>
>> Date: Tue, 4 Sep 2012 15:31:25 +0530
>> To: <user@hadoop.apache.org>
>> Reply-To: user@hadoop.apache.org
>> Subject: Re: Integrating hadoop with java UI application deployed on
>> tomcat
>>
>> also getting one more error
>>
>> org.apache.hadoop.ipc.RemoteException: Server IPC version 5 cannot
>> communicate with client version 4
>>
>> On Tue, Sep 4, 2012 at 2:44 PM, Visioner Sadak <visioner.sadak@gmail.com> wrote:
>>
>>> Thanks Shobha, tried adding the conf folder to Tomcat's classpath, still
>>> getting the same error:
>>>
>>>
>>> Call to localhost/127.0.0.1:9000 failed on local exception:
>>> java.io.IOException: An established connection was aborted by the software
>>> in your host machine
>>>
>>>  On Tue, Sep 4, 2012 at 11:18 AM, Mahadevappa, Shobha <
>>> Shobha.Mahadevappa@nttdata.com> wrote:
>>>
>>>> Hi,
>>>>
>>>> Try adding the hadoop/conf directory to Tomcat's classpath.
>>>>
>>>> ** **
>>>>
>>>> Ex:
>>>> CLASSPATH=/usr/local/Apps/hbase-0.90.4/conf:/usr/local/Apps/hadoop-0.20.203.0/conf:
>>>>
>>>> ** **
>>>>
>>>> ** **
>>>>
>>>> ** **
>>>>
>>>> Regards,
>>>>
>>>> Shobha M
>>>>
>>>> From: Visioner Sadak [mailto:visioner.sadak@gmail.com]
>>>> Sent: 03 September 2012 04:01 PM
>>>> To: user@hadoop.apache.org
>>>>
>>>> Subject: Re: Integrating hadoop with java UI application deployed on
>>>> tomcat
>>>>
>>>> Thanks Steve, there's nothing in the logs and no exceptions either. I found
>>>> that some file is created in my F:\user with the directory name, but it's
>>>> not visible inside my hadoop browse-filesystem directories. I also added
>>>> the config by using the method below:
>>>>
>>>> hadoopConf.addResource("F:/hadoop-0.22.0/conf/core-site.xml");
>>>>
>>>> When running through the WAR, printing out the filesystem I am getting
>>>> org.apache.hadoop.fs.LocalFileSystem@9cd8db
>>>>
>>>> When running an independent jar within hadoop I am getting
>>>> DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
>>>>
>>>> When running an independent jar I am able to do uploads....
>>>>
>>>>  ****
>>>>
>>>> Just wanted to know: will I have to add something to my Tomcat classpath,
>>>> or is there any other configuration in core-site.xml that I am missing
>>>> out? Thanks for your help.....
>>>>
>>>> On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <stevel@hortonworks.com>
>>>> wrote:
>>>>
>>>> ** **
>>>>
>>>> well, it's worked for me in the past outside Hadoop itself:
>>>>
>>>>
>>>>
>>>> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
>>>>
>>>>
>>>>    1. Turn logging up to DEBUG
>>>>    2. Make sure that the filesystem you've just loaded is what you
>>>>    expect, by logging its value. It may turn out to be file:///,
>>>>    because the normal Hadoop site-config.xml isn't being picked up
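Step 1 above usually means a log4j.properties on the webapp's classpath; a minimal sketch (the appender name is illustrative, the logger categories are the standard Hadoop package names):

```properties
log4j.rootLogger=DEBUG, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{ISO8601} %-5p %c{2}: %m%n
# Or, to keep the noise down, raise only the filesystem and IPC clients:
log4j.logger.org.apache.hadoop.fs=DEBUG
log4j.logger.org.apache.hadoop.ipc=DEBUG
```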
>>>>
>>>>  ** **
>>>>
>>>> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak <
>>>> visioner.sadak@gmail.com> wrote:
>>>>
>>>> but the problem is that my code gets executed with the warning but the
>>>> file is not copied to hdfs; actually I am trying to copy a file from
>>>> local to hdfs
>>>>
>>>>
>>>>    // needs: org.apache.hadoop.conf.Configuration,
>>>>    // org.apache.hadoop.fs.FileSystem, org.apache.hadoop.fs.Path
>>>>    Configuration hadoopConf = new Configuration();
>>>>    // get the default associated file system
>>>>    FileSystem fileSystem = FileSystem.get(hadoopConf);
>>>>    // HarFileSystem harFileSystem = new HarFileSystem(fileSystem);
>>>>    // copy from the local file system to hdfs
>>>>    fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),
>>>>            new Path("/user/TestDir/"));
>>>>
>>>>
>>>>
>>>> ______________________________________________________________________
>>>> Disclaimer:This email and any attachments are sent in strictest
>>>> confidence for the sole use of the addressee and may contain legally
>>>> privileged, confidential, and proprietary data. If you are not the intended
>>>> recipient, please advise the sender by replying promptly to this email and
>>>> then delete and destroy this email and any attachments without any further
>>>> use, copying or forwarding.
>>>>
>>>
>>>
>>
>
