Subject: Re: Cannot compile a basic PutMerge.java program
From: Harsh J
Date: Mon, 28 Jul 2014 13:11:21 +0530
To: user@hadoop.apache.org, R J

Please run it in the same style. The 'java' binary accepts a -cp
parameter too:

java -cp $($HADOOP_HOME/bin/hadoop classpath):. PutMerge

On Mon, Jul 28, 2014 at 11:21 AM, R J wrote:
> Thanks a lot! I could compile with the added classpath:
>
> $ javac -cp $($HADOOP_HOME/bin/hadoop classpath) PutMerge.java
>
> The above created the PutMerge.class file. Now I try to run it:
>
> $ java PutMerge
> Exception in thread "main" java.lang.NoClassDefFoundError: PutMerge
> Caused by: java.lang.ClassNotFoundException: PutMerge
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
> Could not find the main class: PutMerge. Program will exit.
>
> I get the above error. I also tried:
>
> $ set CLASSPATH=/usr/lib/hadoop/bin/hadoop
> $ java PutMerge
>
> and I still get the error.
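The run failure above has two separate causes, both visible in the commands quoted. First, the class path handed to `java` must include the directory holding PutMerge.class (that is the trailing `:.` in the suggested command) as well as the Hadoop jars; `/usr/lib/hadoop/bin/hadoop` is the launcher script, not a jar directory, so pointing CLASSPATH at it cannot help. Second, in sh/bash the `set` builtin does not assign variables at all. A minimal sketch of that shell behavior (no Hadoop needed to run it):

```shell
# In sh/bash, `set NAME=value` sets positional parameters, not variables:
set CLASSPATH=/usr/lib/hadoop/bin/hadoop
echo "after set: CLASSPATH='${CLASSPATH}'"        # still empty

# A plain assignment plus export is what child processes such as
# `java` actually see:
export CLASSPATH="/usr/lib/hadoop/jars-go-here:."
echo "after export: CLASSPATH='${CLASSPATH}'"
```

In practice the simplest route stays the one given above: let the hadoop launcher print its own jar list and append `.`, i.e. `java -cp $($HADOOP_HOME/bin/hadoop classpath):. PutMerge <args>`.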
>
> On Sunday, July 27, 2014 10:16 PM, Harsh J wrote:
>
> The javac program can only find the import dependencies referenced in
> a program if they are also supplied on the javac classpath. Setting
> HADOOP_HOME alone will not magically do this. Have you set an
> appropriate classpath?
>
> Try as below, perhaps:
>
> javac -cp $($HADOOP_HOME/bin/hadoop classpath) PutMerge.java
>
> Alternatively, consider using a modern build tool such as Apache
> Maven for writing Java applications; it makes this work easier.
>
> On Mon, Jul 28, 2014 at 6:16 AM, R J wrote:
>> Hi All,
>>
>> I am new to programming on Hadoop. I tried to compile the following
>> program (an example from a Hadoop book) on my Linux server where I
>> have Hadoop installed, and I get these errors:
>>
>> $ javac PutMerge.java
>> PutMerge.java:2: package org.apache.hadoop.conf does not exist
>> import org.apache.hadoop.conf.Configuration;
>>                              ^
>> PutMerge.java:3: package org.apache.hadoop.fs does not exist
>> import org.apache.hadoop.fs.FSDataInputStream;
>>                             ^
>> PutMerge.java:4: package org.apache.hadoop.fs does not exist
>> import org.apache.hadoop.fs.FSDataOutputStream;
>>                             ^
>> PutMerge.java:5: package org.apache.hadoop.fs does not exist
>> import org.apache.hadoop.fs.FileStatus;
>>                             ^
>> PutMerge.java:6: package org.apache.hadoop.fs does not exist
>> import org.apache.hadoop.fs.FileSystem;
>>                             ^
>> PutMerge.java:7: package org.apache.hadoop.fs does not exist
>> import org.apache.hadoop.fs.Path;
>>
>> I have $HADOOP_HOME set up:
>>
>> $ echo $HADOOP_HOME
>> /usr/lib/hadoop
>>
>> Could you please suggest how to compile this program? Thanks a lot.
>>
>> Shu
>>
>> ==== PutMerge.java =========
>> import java.io.IOException;
>> import org.apache.hadoop.conf.Configuration;
>> import org.apache.hadoop.fs.FSDataInputStream;
>> import org.apache.hadoop.fs.FSDataOutputStream;
>> import org.apache.hadoop.fs.FileStatus;
>> import org.apache.hadoop.fs.FileSystem;
>> import org.apache.hadoop.fs.Path;
>>
>> public class PutMerge {
>>
>>     public static void main(String[] args) throws IOException {
>>         Configuration conf = new Configuration();
>>         FileSystem hdfs = FileSystem.get(conf);
>>         FileSystem local = FileSystem.getLocal(conf);
>>
>>         Path inputDir = new Path(args[0]);
>>         Path hdfsFile = new Path(args[1]);
>>
>>         try {
>>             FileStatus[] inputFiles = local.listStatus(inputDir);
>>             FSDataOutputStream out = hdfs.create(hdfsFile);
>>
>>             for (int i = 0; i < inputFiles.length; i++) {
>>                 System.out.println(inputFiles[i].getPath().getName());
>>                 FSDataInputStream in = local.open(inputFiles[i].getPath());
>>                 byte[] buffer = new byte[256];
>>                 int bytesRead = 0;
>>                 while ((bytesRead = in.read(buffer)) > 0) {
>>                     out.write(buffer, 0, bytesRead);
>>                 }
>>                 in.close();
>>             }
>>             out.close();
>>         } catch (IOException e) {
>>             e.printStackTrace();
>>         }
>>     }
>> }
>> =============
>
> --
> Harsh J

--
Harsh J
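A side note on the read-buffer-write loop in PutMerge above: the same merge pattern can be sketched with plain java.io streams and try-with-resources, so the streams are closed even when an exception is thrown (the original's explicit close() calls are skipped if read or write fails). This is a local-files-only illustration that compiles without the Hadoop jars; the class name MergeLocal and the use of java.nio.file are this sketch's choices, not part of the original program. The Hadoop FSDataInputStream/FSDataOutputStream in PutMerge follow the same copy pattern.

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class MergeLocal {

    // Merge every regular file in inputDir into outFile, using the same
    // 256-byte buffer-copy loop as PutMerge, but on local files only.
    public static void merge(Path inputDir, Path outFile) throws IOException {
        try (OutputStream out = Files.newOutputStream(outFile);
             DirectoryStream<Path> files = Files.newDirectoryStream(inputDir)) {
            for (Path p : files) {
                if (!Files.isRegularFile(p)) {
                    continue;              // skip subdirectories etc.
                }
                System.out.println(p.getFileName());
                try (InputStream in = Files.newInputStream(p)) {
                    byte[] buffer = new byte[256];
                    int bytesRead;
                    while ((bytesRead = in.read(buffer)) > 0) {
                        out.write(buffer, 0, bytesRead);
                    }
                } // `in` is closed here, even on error
            }
        } // `out` and the directory stream are closed here
    }

    public static void main(String[] args) throws IOException {
        merge(Paths.get(args[0]), Paths.get(args[1]));
    }
}
```

Because it has no Hadoop dependency, this variant compiles with a bare `javac MergeLocal.java` and runs with `java -cp . MergeLocal <inputDir> <outFile>`, which also makes the classpath discussion above easy to try out.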