Date: Thu, 30 Aug 2012 17:27:42 +0100
From: Steve Loughran <stevel@hortonworks.com>
To: user@hadoop.apache.org
Subject: Re: Integrating hadoop with java UI application deployed on tomcat

On 30 August 2012 13:54, Visioner Sadak <visioner.sadak@gmail.com> wrote:

> Thanks a ton, guys, for your help. I used hadoop-core-1.0.3.jar and
> commons-lang-2.1.jar to get rid of the class-not-found error, but now I
> am getting this error. Is this because I am using my app and Hadoop on
> Windows?
>
> util.NativeCodeLoader: Unable to load native-hadoop library for your
> platform... using builtin-java classes where applicable

No, that's warning you that the native code that helps with some
operations (especially compression) isn't loading, because your JVM's
native library path isn't set up right. Just edit log4j to hide that
class's log messages.
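For that particular message, a single line in your log4j.properties should
be enough. A sketch, assuming the warning comes from
org.apache.hadoop.util.NativeCodeLoader, which is what the
"util.NativeCodeLoader" prefix in the message suggests:

# hide the "Unable to load native-hadoop library" warning
log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR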
FWIW, I've downgraded some other messages that are overly noisy,
especially if you bring up a MiniMR/MiniDFS cluster for test runs (log4j
wants WARN, not WARNING, as the level name):

log4j.logger.org.apache.hadoop.hdfs.server.datanode.DataStorage=WARN
log4j.logger.org.apache.hadoop.hdfs.server.datanode.DataXceiverServer=WARN
log4j.logger.org.apache.hadoop.hdfs.server.datanode.DataBlockScanner=WARN
log4j.logger.org.apache.hadoop.hdfs.server.datanode.FSDataset=FATAL
log4j.logger.org.apache.hadoop.metrics2=FATAL
log4j.logger.org.apache.hadoop.ipc.metrics.RpcInstrumentation=WARN
log4j.logger.org.apache.hadoop.ipc.Server=WARN
log4j.logger.org.apache.hadoop.metrics=FATAL

> On Thu, Aug 30, 2012 at 5:22 PM, Steve Loughran <stevel@hortonworks.com> wrote:
>
>> You will need almost the entire Hadoop client-side JAR set and its
>> dependencies for this, I'm afraid.
>> The new webhdfs filesystem (HDFS over HTTP) is designed to be lighter
>> weight and to need only an HTTP client, but I'm not aware of any
>> ultra-thin client yet (Apache HttpComponents should suffice).
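To make the "only an HTTP client" point concrete, here is an untested
sketch of a WebHDFS upload using nothing but java.net (HttpComponents
would work the same way). It assumes dfs.webhdfs.enabled is set on the
cluster; the host, port, path, and user name are placeholders:

import java.io.FileInputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class WebHdfsUpload {
    public static void main(String[] args) throws Exception {
        // Step 1: ask the namenode where to write. It answers with a
        // 307 redirect to a datanode instead of accepting the data itself.
        URL nn = new URL("http://namenode:50070/webhdfs/v1/user/TestDir/GANI.jpg"
                + "?op=CREATE&user.name=hadoopuser");
        HttpURLConnection ask = (HttpURLConnection) nn.openConnection();
        ask.setRequestMethod("PUT");
        ask.setInstanceFollowRedirects(false); // we want the Location header
        String dataNode = ask.getHeaderField("Location");
        ask.disconnect();

        // Step 2: stream the file body to the datanode URL we were given.
        HttpURLConnection put = (HttpURLConnection)
                new URL(dataNode).openConnection();
        put.setRequestMethod("PUT");
        put.setDoOutput(true);
        InputStream in = new FileInputStream("E:/test/GANI.jpg");
        OutputStream out = put.getOutputStream();
        byte[] buf = new byte[8192];
        for (int n; (n = in.read(buf)) > 0; ) {
            out.write(buf, 0, n);
        }
        out.close();
        in.close();

        // 201 Created means the file is now in HDFS.
        System.out.println("HTTP " + put.getResponseCode());
        put.disconnect();
    }
}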

>> If you are using any of the build tools with dependency management
>> (Ant+Ivy, Maven, Gradle), ask for the Hadoop JARs and have the
>> dependencies pulled in.
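With Maven, for instance, a single dependency should pull in the client
JARs and their transitive dependencies; something like this, matching the
hadoop-core-1.0.3.jar mentioned earlier:

<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-core</artifactId>
  <version>1.0.3</version>
</dependency>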

>> If you aren't using any of the build tools with dependency management,
>> now is the time.


>> On 30 August 2012 09:32, Visioner Sadak <visioner.sadak@gmail.com> wrote:
>>> Hi,
>>>
>>> I have a WAR deployed on a Tomcat server; the WAR contains some Java
>>> classes which upload files. Will I be able to upload directly into
>>> Hadoop? I am using the code below in one of my Java classes:
>>>
>>> import org.apache.hadoop.conf.Configuration;
>>> import org.apache.hadoop.fs.FileSystem;
>>> import org.apache.hadoop.fs.Path;
>>>
>>> Configuration hadoopConf = new Configuration();
>>> // get the default associated file system
>>> FileSystem fileSystem = FileSystem.get(hadoopConf);
>>> // HarFileSystem harFileSystem = new HarFileSystem(fileSystem);
>>> // copy from the local file system to HDFS
>>> fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),
>>>         new Path("/user/TestDir/"));
>>>
>>> but it's throwing up this error:

>>> java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration

>>> When this code is run independently, using a single jar deployed in
>>> the Hadoop bin directory, it works fine.
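One more thing to watch once the classpath is sorted out: inside Tomcat,
new Configuration() only sees your cluster settings if core-site.xml is on
the webapp classpath (e.g. under WEB-INF/classes); otherwise
FileSystem.get() quietly hands back the local file system. A minimal
sketch that points the client at the cluster explicitly; the namenode URI
is a placeholder for your own:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsUploader {
    public static void upload(String localFile, String hdfsDir) throws Exception {
        Configuration conf = new Configuration();
        // Without core-site.xml on the classpath, set the default FS by
        // hand; "hdfs://namenode:9000" stands in for your cluster's URI.
        conf.set("fs.default.name", "hdfs://namenode:9000");
        FileSystem fs = FileSystem.get(conf);
        fs.copyFromLocalFile(new Path(localFile), new Path(hdfsDir));
    }
}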
