hadoop-hdfs-user mailing list archives

From Steve Loughran <ste...@hortonworks.com>
Subject Re: Integrating hadoop with java UI application deployed on tomcat
Date Thu, 30 Aug 2012 16:27:42 GMT
On 30 August 2012 13:54, Visioner Sadak <visioner.sadak@gmail.com> wrote:

> Thanks a ton, guys, for your help. I used hadoop-core-1.0.3.jar and
> commons-lang-2.1.jar to get rid of the class-not-found error. Now I am
> getting this error; is it because I am running my app and Hadoop on Windows?
> util.NativeCodeLoader: Unable to load native-hadoop library for your
> platform... using builtin-java classes where applicable

No, that's a warning that the native code which speeds up some operations
(especially compression) isn't loading, because your JVM's native library
path isn't set up correctly.

Just edit your log4j configuration to hide that class's log messages.
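
For example, a log4j.properties override along these lines silences it (the logger name matches the class in the warning; treat this as a sketch to adapt to your own config):

```properties
# Only log the native-loader class at ERROR and above, hiding the WARN
log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR
```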

FWIW, I've downgraded some other messages that are over-noisy, especially
if you bring up a MiniMR/MiniDFS cluster for test runs.


> On Thu, Aug 30, 2012 at 5:22 PM, Steve Loughran <stevel@hortonworks.com>wrote:
>> you will need almost the entire hadoop client-side JAR set and
>> dependencies for this, I'm afraid.
>> The new webhdfs filesystem (HDFS over HTTP) is designed to be
>> lighter-weight and to need only an HTTP client, but I'm not aware of any
>> ultra-thin client yet (Apache HttpComponents should suffice).
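
As a sketch of what that HTTP-only route looks like (host names and the default ports here are placeholders; webhdfs creates a file in two steps):

```shell
# Step 1: ask the namenode where to write; it replies 307 with a Location header
curl -i -X PUT "http://namenode:50070/webhdfs/v1/user/TestDir/GANI.jpg?op=CREATE"
# Step 2: PUT the file bytes to the datanode URL returned in that Location header
curl -i -X PUT -T GANI.jpg "<Location-from-step-1>"
```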
>> If you are using any of the build tools with dependency management:
>> Ant+Ivy, Maven, Gradle, ask for Hadoop JARs and have the dependencies
>> pulled in.
>> If you aren't using any of the build tools w/ dependency management, now
>> is the time.
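
For instance, with Maven the whole client-side JAR set the thread mentions comes in transitively from a single declaration (the version matches the hadoop-core JAR used earlier in the thread; adjust to yours):

```xml
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-core</artifactId>
  <version>1.0.3</version>
</dependency>
```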
>> On 30 August 2012 09:32, Visioner Sadak <visioner.sadak@gmail.com> wrote:
>>> Hi,
>>>   I have a WAR deployed on a Tomcat server; the WAR contains some Java
>>> classes which upload files. Will I be able to upload directly into
>>> Hadoop? I am using the below code in one of my Java classes:
>>>        Configuration hadoopConf = new Configuration();
>>>        // get the default associated file system
>>>        FileSystem fileSystem = FileSystem.get(hadoopConf);
>>>        // HarFileSystem harFileSystem = new HarFileSystem(fileSystem);
>>>        // copy from local FS to HDFS
>>>        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),
>>>                new Path("/user/TestDir/"));
>>> but it's throwing this error:
>>> java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
>>> When this code is run independently, using a single JAR deployed in the
>>> Hadoop bin, it works fine.
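
Pulling the snippet from the thread together, a minimal standalone version looks like this. It assumes the Hadoop client JARs and a core-site.xml pointing at your cluster are on the classpath; the paths are the ones from the thread:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsUpload {
    public static void main(String[] args) throws Exception {
        // Reads core-site.xml etc. from the classpath
        Configuration hadoopConf = new Configuration();
        // Resolve the default file system (HDFS if fs.default.name says so)
        FileSystem fileSystem = FileSystem.get(hadoopConf);
        // Copy a local file into an HDFS directory
        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),
                new Path("/user/TestDir/"));
    }
}
```

If the NoClassDefFoundError persists inside Tomcat, the Hadoop JARs are missing from the WAR's WEB-INF/lib rather than from the code itself.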
