accumulo-user mailing list archives

From Josh Elser <josh.el...@gmail.com>
Subject Re: Need help. Error: java.lang.NoSuchMethodError: org.apache.commons.codec.binary.Base64.encodeBase64String
Date Mon, 16 Jun 2014 15:34:04 GMT
By the error, it looks like it might be the "wrong" version of 
commons-codec (it's a NoSuchMethodError rather than a 
NoClassDefFoundError, so the class is present but the method is 
missing). Depending on the version/distro of Hadoop you're running 
against, some dependencies that stock Apache Hadoop provides just aren't 
provided by your version. Feel free to send back Hadoop version info if 
you need some more help tracking down the problem.
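
One quick way to see which commons-codec jars are in play (a sketch, 
assuming typical install locations and a local `lib` directory for your 
job's jars; adjust the paths for your distro):

```shell
# List every commons-codec jar visible under Hadoop, Accumulo, and the
# job's lib dir. A commons-codec older than 1.4 appearing ahead of a 1.4
# jar on the classpath would explain the NoSuchMethodError, since
# Base64.encodeBase64String(byte[]) first appeared in commons-codec 1.4.
for dir in "${HADOOP_HOME:-/usr/lib/hadoop}" \
           "${ACCUMULO_HOME:-/usr/lib/accumulo}" \
           lib; do
    if [ -d "$dir" ]; then
        find "$dir" -name 'commons-codec-*.jar'
    fi
done
```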

It looks like Accumulo 1.5, 1.6, and 1.7 (the current unreleased master 
branch) all depend on commons-codec 1.4, if that helps you track down 
where this dependency is (or isn't) coming from.
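
If you do go the route Sean suggests below of feeding Accumulo's 
classpath to Spark, here's a rough sketch for munging the output of 
`accumulo classpath` into the comma-separated list that spark-submit's 
--jars flag expects (the exact output format varies by version, so the 
filtering may need adjusting):

```shell
# Extract anything ending in .jar from stdin and join the entries with
# commas, the format spark-submit's --jars flag expects.
to_jar_list() {
    grep -o '[^ ]*\.jar' | tr '\n' ',' | sed 's/,$//'
}

# Usage (hypothetical paths):
#   JARS=$("${ACCUMULO_HOME}/bin/accumulo" classpath | to_jar_list)
#   spark-submit --master yarn-client --jars "$JARS" rtgraph.jar
```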

- Josh

On 6/16/14, 6:16 AM, Sean Busbey wrote:
> Hi Jianshi!
>
> What version of Accumulo are you using?
>
> I believe you're running into a problem with what Accumulo presumes will
> be provided by Hadoop. The immediate fault is that Accumulo wants
> commons-codec. It's normally in the classpath as a part of Hadoop client.
>
> To make sure you have the entire classpath, I believe you're going to
> have to use the "accumulo classpath" command:
>
> ${ACCUMULO_HOME}/bin/accumulo classpath
>
> Unfortunately, the output of this command is not tailored toward
> passing to another process, so you'll have to do some follow-on munging.
> In many of the Accumulo setups I've seen, there are far more jars
> included in this classpath than Accumulo actually needs, so it might be
> best to use it as a guide for what you include manually.
>
>
> On Mon, Jun 16, 2014 at 5:19 AM, Jianshi Huang <jianshi.huang@gmail.com
> <mailto:jianshi.huang@gmail.com>> wrote:
>
>     Hi,
>
>     I'm trying to use Accumulo with Spark writing to
>     AccumuloOutputFormat. I got the following errors in my spark app log:
>
>     14/06/16 02:01:44 INFO cluster.YarnClientClusterScheduler:
>     YarnClientClusterScheduler.postStartHook done
>     Exception in thread "main" java.lang.NoSuchMethodError:
>     org.apache.commons.codec.binary.Base64.encodeBase64String([B)Ljava/lang/String;
>              at
>     org.apache.accumulo.core.client.mapreduce.lib.impl.ConfiguratorBase.setConnectorInfo(ConfiguratorBase.java:127)
>              at
>     org.apache.accumulo.core.client.mapreduce.AccumuloOutputFormat.setConnectorInfo(AccumuloOutputFormat.java:92)
>              at
>     com.paypal.rtgraph.demo.MapReduceWriter$.main(MapReduceWriter.scala:44)
>              at
>     com.paypal.rtgraph.demo.MapReduceWriter.main(MapReduceWriter.scala)
>              at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>              at
>     sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>              at
>     sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>              at java.lang.reflect.Method.invoke(Method.java:606)
>              at
>     org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:292)
>              at
>     org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
>              at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>
>     Anyone knows what's wrong with my code or settings? I've added all
>     jars to spark's classpath.
>
>     And here's my spark-submit cmd:
>
>     spark-submit --class com.paypal.rtgraph.demo.MapReduceWriter
>     --master yarn-client --jars `find lib -type f | tr '\n' ','`
>     --driver-memory 2G --executor-memory 20G --executor-cores 8
>     --num-executors 2 --verbose rtgraph.jar
>
>     If anyone has Accumulo+Spark examples, it would be great if they
>     could be shared. :)
>
>
>     Cheers,
>
>     --
>     Jianshi Huang
>
>     LinkedIn: jianshi
>     Twitter: @jshuang
>     Github & Blog: http://huangjs.github.com/
>
>
>
>
> --
> Sean
