hadoop-user mailing list archives

From Visioner Sadak <visioner.sa...@gmail.com>
Subject Re: Integrating hadoop with java UI application deployed on tomcat
Date Mon, 03 Sep 2012 17:53:37 GMT
Thanks Senthil, the namenode is up and running, and in core-site.xml I have:

<configuration>
<property>
<name>fs.default.name</name>
<value>hdfs://localhost:9000</value>
</property>
</configuration>

Should I change my IP or any other config?
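(One thing worth checking: localhost:9000 only works when Tomcat runs on the same machine as the namenode. If they are on different hosts, fs.default.name should point at the namenode's actual hostname or IP. A hedged sketch; namenode-host is a placeholder, not a value from this thread:)

```xml
<configuration>
  <property>
    <name>fs.default.name</name>
    <!-- namenode-host is a placeholder: substitute the namenode's real hostname or IP -->
    <value>hdfs://namenode-host:9000</value>
  </property>
</configuration>
```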

On Mon, Sep 3, 2012 at 10:11 PM, Senthil Kumar <
senthilkumar@thoughtworks.com> wrote:

> The error says the call to 127.0.0.1:9000 fails. It is failing when it tries
> to contact the namenode (9000 is the default namenode port) configured in
> core-site.xml. You should check whether the namenode is configured
> correctly and whether it is up.
>
>
> On Mon, Sep 3, 2012 at 7:43 PM, Visioner Sadak <visioner.sadak@gmail.com> wrote:
>
>> Thanks Senthil, I tried with the new Path and am getting the error below. Do
>> I have to do any SSL setting on Tomcat as well?
>>
>>
>> java.io.IOException: Call to localhost/127.0.0.1:9000 failed on local
>> exception: java.io.IOException: An established connection was aborted by
>> the software in your host machine
>>
>> at org.apache.hadoop.ipc.Client.wrapException(Client.java:1107)
>> at org.apache.hadoop.ipc.Client.call(Client.java:1075)
>> at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
>>
>>
>>  On Mon, Sep 3, 2012 at 7:22 PM, Senthil Kumar <
>> senthilkumar@thoughtworks.com> wrote:
>>
>>> Try using hadoopConf.addResource(new
>>> Path("F:/hadoop-0.22.0/conf/core-site.xml")); instead of
>>> hadoopConf.addResource("F:/hadoop-0.22.0/conf/core-site.xml");
>>>
>>> Or you should add your core-site.xml to a location that is on your
>>> classpath (WEB-INF\classes or WEB-INF\lib in the case of a web application).
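(A quick way to check whether core-site.xml is actually visible on the classpath, since a plain-String addResource and Hadoop's default-resource loading both look it up as a classpath resource. A minimal, self-contained sketch; ClasspathCheck is a hypothetical helper name, not something from this thread:)

```java
// Hypothetical helper: reports whether a named resource (e.g. core-site.xml)
// is visible on the current classpath -- the same lookup Hadoop performs
// when Configuration.addResource is given a String instead of a Path.
public class ClasspathCheck {
    public static boolean onClasspath(String name) {
        return ClasspathCheck.class.getClassLoader().getResource(name) != null;
    }

    public static void main(String[] args) {
        String name = args.length > 0 ? args[0] : "core-site.xml";
        System.out.println(name + (onClasspath(name)
                ? " is on the classpath" : " is NOT on the classpath"));
    }
}
```

Running this inside the deployed webapp shows immediately whether dropping the file into WEB-INF/classes worked.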
>>>
>>>
>>> On Mon, Sep 3, 2012 at 6:02 PM, Visioner Sadak <visioner.sadak@gmail.com
>>> > wrote:
>>>
>>>> Thanks Hemanth, I tried adding the conf folder and then the root folder
>>>> (I was unable to add the XML file alone), but I still have the same
>>>> problem. Thanks for the help.
>>>>
>>>> On Mon, Sep 3, 2012 at 4:11 PM, Hemanth Yamijala <yhemanth@gmail.com> wrote:
>>>>
>>>>> Hi,
>>>>>
>>>>> If you are getting the LocalFileSystem, you could try by putting
>>>>> core-site.xml in a directory that's there in the classpath for the
>>>>> Tomcat App (or include such a path in the classpath, if that's
>>>>> possible)
>>>>>
>>>>> Thanks
>>>>> hemanth
>>>>>
>>>>> On Mon, Sep 3, 2012 at 4:01 PM, Visioner Sadak <
>>>>> visioner.sadak@gmail.com> wrote:
>>>>> > Thanks Steve, there's nothing in the logs and no exceptions either. I
>>>>> > found that some file is created in my F:\user with the directory name,
>>>>> > but it's not visible inside my Hadoop browse-filesystem directories. I
>>>>> > also added the config using the method below:
>>>>> > hadoopConf.addResource("F:/hadoop-0.22.0/conf/core-site.xml");
>>>>> > When running through the WAR and printing out the filesystem I get
>>>>> > org.apache.hadoop.fs.LocalFileSystem@9cd8db
>>>>> > When running an independent jar within Hadoop I get
>>>>> > DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
>>>>> > and when running an independent jar I am able to do uploads.
>>>>> >
>>>>> > I just wanted to know whether I have to add something to the classpath
>>>>> > of Tomcat, or whether there is any other core-site.xml configuration
>>>>> > that I am missing. Thanks for your help.
>>>>> >
>>>>> >
>>>>> >
>>>>> > On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <
>>>>> stevel@hortonworks.com>
>>>>> > wrote:
>>>>> >>
>>>>> >>
>>>>> >> well, it's worked for me in the past outside Hadoop itself:
>>>>> >>
>>>>> >>
>>>>> >>
>>>>> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
>>>>> >>
>>>>> >> Turn logging up to DEBUG.
>>>>> >> Make sure that the filesystem you've just loaded is what you expect,
>>>>> >> by logging its value. It may turn out to be file:///, because the
>>>>> >> normal Hadoop site-config.xml isn't being picked up.
>>>>> >>
>>>>> >>
>>>>> >>>
>>>>> >>>
>>>>> >>> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak
>>>>> >>> <visioner.sadak@gmail.com> wrote:
>>>>> >>>>
>>>>> >>>> But the problem is that my code executes with the warning, yet the
>>>>> >>>> file is not copied to HDFS. Actually, I am trying to copy a file
>>>>> >>>> from local to HDFS:
>>>>> >>>>
>>>>> >>>>    Configuration hadoopConf = new Configuration();
>>>>> >>>>    // get the default associated file system
>>>>> >>>>    FileSystem fileSystem = FileSystem.get(hadoopConf);
>>>>> >>>>    // HarFileSystem harFileSystem = new HarFileSystem(fileSystem);
>>>>> >>>>    // copy from the local file system to HDFS
>>>>> >>>>    fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),
>>>>> >>>>            new Path("/user/TestDir/"));
>>>>> >>>>
>>>>> >>
>>>>> >>
>>>>> >
>>>>>
>>>>
>>>>
>>>
>>
>
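(Pulling the thread's suggestions together, a hedged sketch of the upload with the explicit Path-based addResource and a sanity check on which filesystem was loaded. It assumes the Hadoop jars are on Tomcat's classpath and reuses the paths quoted in the thread; this is not runnable without a Hadoop installation:)

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsUpload {
    public static void main(String[] args) throws Exception {
        Configuration hadoopConf = new Configuration();
        // Load the cluster config explicitly via a Path, as suggested above;
        // a plain String argument is treated as a classpath resource, not a file path.
        hadoopConf.addResource(new Path("F:/hadoop-0.22.0/conf/core-site.xml"));

        FileSystem fileSystem = FileSystem.get(hadoopConf);
        // Sanity check: if this prints file:/// (LocalFileSystem), core-site.xml
        // was not picked up and the copy would go to the local disk instead of HDFS.
        System.out.println("Using filesystem: " + fileSystem.getUri());

        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),
                new Path("/user/TestDir/"));
    }
}
```

Logging the filesystem URI first is exactly the check Steve suggested: it distinguishes the LocalFileSystem case seen from the WAR from the DFSClient case seen from the standalone jar.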
