From: Harsh J
Date: Tue, 19 Feb 2013 23:15:51 +0530
Subject: Re: Trouble in running MapReduce application
Reply-To: user@hadoop.apache.org
Delivered-To: mailing list user@hadoop.apache.org

Oops. I just noticed Hemanth has been answering on a dupe thread as
well. Let's drop this thread and carry on there :)

On Tue, Feb 19, 2013 at 11:14 PM, Harsh J wrote:
> Hi,
>
> The new error usually happens if you compile using Java 7 and try to
> run via Java 6 (for example). That is, the binary artifact was built
> for a newer runtime than the one that is executing it.
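
For anyone hitting the same trace: "Unsupported major.minor version 51.0"
means the class files target Java 7 (class-file version 51), while the JVM
loading them is Java 6, which accepts at most version 50. A minimal sketch
of how to confirm and fix this, assuming the cluster runs Java 6; the jar,
file, and directory names follow the thread, and the hadoop-core jar path
assumes a stock Hadoop 1.0.4 tarball layout:

    # Compare the compiler with the runtime Hadoop uses (they can differ):
    javac -version
    java -version

    # Inspect the class-file version packed into the jar (50 = Java 6, 51 = Java 7):
    javap -classpath flow19028pm.jar -verbose org.myorg.MapReduce | grep "major version"

    # Either compile with a JDK that matches the cluster's JVM, or ask a
    # newer javac to emit Java 6 bytecode explicitly:
    javac -source 1.6 -target 1.6 \
          -classpath hadoop-1.0.4/hadoop-core-1.0.4.jar \
          -d flowclasses_18_02 MapReduce.java

After recompiling, rebuild the jar and rerun; the hadoop jar invocation
itself does not need to change.
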
>
> On Tue, Feb 19, 2013 at 10:09 PM, Fatih Haltas wrote:
>> Thank you very much, Harsh.
>>
>> Now, as I promised earlier, I am much obliged to you.
>>
>> I got past that problem by changing directories and creating a jar of
>> the org directory again, but now I am getting this error:
>>
>> 1.) What I got
>> ------------------------------------------------------------------------------
>> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ hadoop jar flow19028pm.jar
>> org.myorg.MapReduce /home/hadoop/project/hadoop-data/NetFlow 19_02.out
>> Warning: $HADOOP_HOME is deprecated.
>>
>> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>> org/myorg/MapReduce : Unsupported major.minor version 51.0
>>         at java.lang.ClassLoader.defineClass1(Native Method)
>>         at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
>>         at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>>         at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
>>         at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>>         at java.lang.Class.forName0(Native Method)
>>         at java.lang.Class.forName(Class.java:266)
>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
>>
>> 2.) How I create my jar
>> -------------------------------------------------------------------------------------
>> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar cvf flow19028pm.jar org
>> added manifest
>> adding: org/(in = 0) (out= 0)(stored 0%)
>> adding: org/myorg/(in = 0) (out= 0)(stored 0%)
>> adding: org/myorg/MapReduce$FlowPortReducer.class(in = 1661) (out= 690)(deflated 58%)
>> adding: org/myorg/MapReduce.class(in = 1587) (out= 903)(deflated 43%)
>> adding: org/myorg/MapReduce$FlowPortMapper.class(in = 1874) (out= 823)(deflated 56%)
>>
>> 3.) Content of my jar file
>> ---------------------------------------------------------------------------------------
>> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar tf flow19028pm.jar
>> META-INF/
>> META-INF/MANIFEST.MF
>> org/
>> org/myorg/
>> org/myorg/MapReduce$FlowPortReducer.class
>> org/myorg/MapReduce.class
>> org/myorg/MapReduce$FlowPortMapper.class
>> -----------------------------------------------------------------------------------------
>>
>> Thank you very much.
>>
>>
>> On Tue, Feb 19, 2013 at 8:20 PM, Harsh J wrote:
>>>
>>> Your point (4) explains the problem. The jar's packed structure should
>>> look like the below, and not how it is at present (one extra top-level
>>> directory is in the way):
>>>
>>> META-INF/
>>> META-INF/MANIFEST.MF
>>> org/
>>> org/myorg/
>>> org/myorg/WordCount.class
>>> org/myorg/WordCount$TokenizerMapper.class
>>> org/myorg/WordCount$IntSumReducer.class
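
RunJar loads the driver class by its fully qualified name, so the package
directories must start at the root of the jar. A jar built from the parent
directory carries an extra wordcount_classes/ prefix, and
Class.forName("org.myorg.WordCount") can no longer resolve the class. A
small sketch of repackaging it correctly (directory and jar names follow
the thread):

    # Option 1: build the jar from inside the classes directory
    (cd wordcount_classes && jar cvf ../wordcount_19_02.jar org)

    # Option 2: let jar change into the directory for you with -C
    jar cvf wordcount_19_02.jar -C wordcount_classes .

    # Verify: org/myorg/WordCount.class must appear with no extra prefix
    jar tf wordcount_19_02.jar
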
>>>
>>> On Tue, Feb 19, 2013 at 9:29 PM, Fatih Haltas wrote:
>>> > Hi everyone,
>>> >
>>> > I know it is a common mistake not to specify the full class name when
>>> > trying to run a jar; however, although I do specify it, I am still
>>> > getting the ClassNotFoundException.
>>> >
>>> > What may be the reason for it? I have been struggling with this
>>> > problem for more than 2 days. I wrote a different MapReduce
>>> > application for some analysis and ran into this problem.
>>> >
>>> > To check whether something is wrong with my system, I tried to run
>>> > the WordCount example. When I just run hadoop-examples wordcount, it
>>> > works fine. But when I add just "package org.myorg;" at the beginning,
>>> > it does not work.
>>> >
>>> > Here is what I have done so far:
>>> >
>>> > *************************************************************************
>>> > 1. I copied the WordCount code from Apache's own examples source code
>>> > and changed only the package declaration to "package org.myorg;".
>>> >
>>> > **************************************************************************
>>> > 2. Then I tried to run this command:
>>> >
>>> > *************************************************************************
>>> > "hadoop jar wordcount_19_02.jar org.myorg.WordCount
>>> > /home/hadoop/project/hadoop-data/NetFlow 19_02_wordcount.output"
>>> >
>>> > *************************************************************************
>>> > 3. I got the following error:
>>> >
>>> > **************************************************************************
>>> > [hadoop@ADUAE042-LAP-V project]$ hadoop jar wordcount_19_02.jar
>>> > org.myorg.WordCount /home/hadoop/project/hadoop-data/NetFlow
>>> > 19_02_wordcount.output
>>> > Warning: $HADOOP_HOME is deprecated.
>>> >
>>> > Exception in thread "main" java.lang.ClassNotFoundException:
>>> > org.myorg.WordCount
>>> >         at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
>>> >         at java.security.AccessController.doPrivileged(Native Method)
>>> >         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>>> >         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>>> >         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>>> >         at java.lang.Class.forName0(Native Method)
>>> >         at java.lang.Class.forName(Class.java:266)
>>> >         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
>>> >
>>> > **************************************************************************
>>> > 4. This is the content of my .jar file:
>>> > ****************************************************
>>> > [hadoop@ADUAE042-LAP-V project]$ jar tf wordcount_19_02.jar
>>> > META-INF/
>>> > META-INF/MANIFEST.MF
>>> > wordcount_classes/
>>> > wordcount_classes/org/
>>> > wordcount_classes/org/myorg/
>>> > wordcount_classes/org/myorg/WordCount.class
>>> > wordcount_classes/org/myorg/WordCount$TokenizerMapper.class
>>> > wordcount_classes/org/myorg/WordCount$IntSumReducer.class
>>> > **********************************************************
>>> > 5. This is the 'ls' output of my working directory:
>>> > **********************************************************
>>> > [hadoop@ADUAE042-LAP-V project]$ ls
>>> > flowclasses_18_02  flowclasses_18_02.jar  hadoop-1.0.4  hadoop-1.0.4.tar.gz
>>> > hadoop-data  MapReduce.java  sample  wordcount_19_02.jar
>>> > wordcount_classes  WordCountClasses  WordCount.java
>>> > *************************************************************
>>> > So, as you can see, the package declaration is fine, but I am really
>>> > stuck. Everything I find through Google says the same thing: specify
>>> > the full package path of your main class. I already knew that, and I
>>> > do specify it, but it still does not work.
>>> >
>>> > I would be much obliged to anyone who can help me.
>>> >
>>> > Regards,
>>>
>>>
>>>
>>> --
>>> Harsh J
>>
>>
>
>
>
> --
> Harsh J

--
Harsh J
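
One further option, a sketch assuming the stock Hadoop 1.0.4 RunJar
behaviour: record the driver in the jar manifest so the class name can be
left off the command line entirely (jar, class, and path names follow the
thread):

    # -e writes Main-Class: org.myorg.WordCount into META-INF/MANIFEST.MF
    jar cvfe wordcount_19_02.jar org.myorg.WordCount -C wordcount_classes .

    # With a Main-Class set, "hadoop jar" needs only the jar and the job arguments
    hadoop jar wordcount_19_02.jar \
        /home/hadoop/project/hadoop-data/NetFlow 19_02_wordcount.output
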