Subject: Re: Pass lib jars when invoking an hadoop program
From: xeonmailinglist <xeonmailinglist@gmail.com>
Date: Tue, 03 Feb 2015 18:00:17 +0000
To: user@hadoop.apache.org

Got it. Here's the solution:

```
vagrant@hadoop-coc-1:~/Programs/hadoop$ export HADOOP_CLASSPATH=share/hadoop/tools/lib/hadoop-distcp-2.6.0.jar; hadoop jar wordcount.jar -libjars $HADOOP_HOME/share/hadoop/tools/lib/hadoop-distcp-2.6.0.jar /input1 /outputmp /output1
```
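More generally, the distinction is that `-libjars` only ships the jar to the map/reduce tasks, while the driver JVM (the one that runs `main()` and constructs `DistCpOptions`) resolves classes from `HADOOP_CLASSPATH`. A sketch of the fix with the paths pulled into variables (the `HADOOP_HOME` default below is assumed from the session above, not prescribed):

```shell
# Assumption: HADOOP_HOME points at the Hadoop 2.6.0 install from the session above.
HADOOP_HOME="${HADOOP_HOME:-$HOME/Programs/hadoop}"
DISTCP_JAR="$HADOOP_HOME/share/hadoop/tools/lib/hadoop-distcp-2.6.0.jar"

# The driver JVM resolves DistCpOptions from HADOOP_CLASSPATH;
# -libjars only adds the jar to the job's task classpath.
export HADOOP_CLASSPATH="$DISTCP_JAR${HADOOP_CLASSPATH:+:$HADOOP_CLASSPATH}"

# Then run the job (shown as a comment; it needs a live cluster):
# hadoop jar wordcount.jar -libjars "$DISTCP_JAR" /input1 /outputmp /output1
echo "$HADOOP_CLASSPATH"
```

Setting both covers the two classpaths involved: the client process and the tasks.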

On 03-02-2015 14:58, xeonmailinglist wrote:

Hi,

I am trying to run distcp from a Java class, but I get a class-not-found error for DistCpOptions. I passed the jar with the argument -libjars ./share/hadoop/tools/lib/hadoop-distcp-2.6.0.jar, but that does not seem to be enough. How do I pass the lib properly?

Output:

```
vagrant@hadoop-coc-1:~/Programs/hadoop$ hadoop jar wordcount.jar  -libjars ./share/hadoop/tools/lib/hadoop-distcp-2.6.0.jar /input1 /outputmp /output1
Configuration: core-default.xml, core-site.xml, mapred-default.xml, mapred-site.xml, yarn-default.xml, yarn-site.xml, hdfs-default.xml, hdfs-site.xml
-libjars
./share/hadoop/tools/lib/hadoop-distcp-2.6.0.jar
/input1
/outputmp
/output1
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/tools/DistCpOptions
    at org.apache.hadoop.mapred.examples.WordCount.main(WordCount.java:101)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.tools.DistCpOptions
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
```

My class:

```
public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();

    // GenericOptionsParser consumes generic flags such as -libjars
    String[] otherArgs = new GenericOptionsParser(conf, args).getRemainingArgs();
    if (otherArgs.length < 2) {
        System.err.println("Usage: wordcount <in> [<in>...] <out>");
        System.exit(2);
    }

    System.out.println(conf.toString());
    for (int i = 0; i < args.length; i++) {
        System.out.println(args[i]);
    }

    // distcp from cluster 1 to cluster 2 over webhdfs
    String proto = "webhdfs://";
    String src = proto + "hadoop-coc-1/input1";
    String dest = proto + "hadoop-coc-2/input1";
    List<Path> lsrc = new ArrayList<Path>();
    lsrc.add(new Path(src));
    DistCpOptions options = new DistCpOptions(lsrc, new Path(dest));
    DistCp distcp = new DistCp(new Configuration(), options);
    distcp.execute();
}
```
