accumulo-user mailing list archives

From Jianshi Huang <jianshi.hu...@gmail.com>
Subject Re: Need help. Error: java.lang.NoSuchMethodError: org.apache.commons.codec.binary.Base64.encodeBase64String
Date Mon, 16 Jun 2014 22:20:51 GMT
Vicky, the location of Base64 is very surprising:
(file:/mnt/mapreducelocal07/scratch/local/usercache/jianshuang/filecache/174/spark-assembly-1.0.0-hadoop2.2.0.jar
<no signer certificates>)

So spark-assembly is probably bundling the wrong version of commons-codec.
Looks like I need to rebuild Spark from source.
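
Putting Vicky's suggestion (quoted below) into a small standalone check makes
both facts visible at once: which jar Base64 was actually loaded from, and
whether the missing method exists on that class. This is only an illustrative
sketch (the CodecCheck name is made up), written in Scala since the app itself
is Scala:

// Prints the jar Base64 was loaded from and whether
// encodeBase64String(byte[]) is present on that class.
object CodecCheck {
  def main(args: Array[String]): Unit = {
    val clazz = classOf[org.apache.commons.codec.binary.Base64]

    // The code source is the jar (or directory) the class came from.
    println("codec location: " + clazz.getProtectionDomain.getCodeSource)

    // encodeBase64String(byte[]) was added in commons-codec 1.4; if it is
    // missing, an older commons-codec on the classpath is shadowing the one
    // the application expects.
    val present =
      try { clazz.getMethod("encodeBase64String", classOf[Array[Byte]]); true }
      catch { case _: NoSuchMethodException => false }
    println("encodeBase64String(byte[]) present: " + present)
  }
}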

Thank you all for the help!

Jianshi




On Mon, Jun 16, 2014 at 11:36 PM, Vicky Kak <vicky.kak@gmail.com> wrote:

> >>        at
> com.paypal.rtgraph.demo.MapReduceWriter.main(MapReduceWriter.scala)
>
> Can you add the following piece of code to MapReduceWriter?
>
> System.out.println("codec location: "
>     + org.apache.commons.codec.binary.Base64.class.getProtectionDomain().getCodeSource());
>
> This should print the jar location of the Apache commons-codec library. The
> NoSuchMethodError in your log means the method was not found in the class that
> was actually loaded, most likely because of a version mismatch.
>
>
>
> On Mon, Jun 16, 2014 at 2:49 PM, Jianshi Huang <jianshi.huang@gmail.com>
> wrote:
>
>> Hi,
>>
>> I'm trying to use Accumulo with Spark, writing through AccumuloOutputFormat.
>> I got the following error in my Spark app log:
>>
>> 14/06/16 02:01:44 INFO cluster.YarnClientClusterScheduler:
>> YarnClientClusterScheduler.postStartHook done
>> Exception in thread "main" java.lang.NoSuchMethodError:
>> org.apache.commons.codec.binary.Base64.encodeBase64String([B)Ljava/lang/String;
>>         at
>> org.apache.accumulo.core.client.mapreduce.lib.impl.ConfiguratorBase.setConnectorInfo(ConfiguratorBase.java:127)
>>         at
>> org.apache.accumulo.core.client.mapreduce.AccumuloOutputFormat.setConnectorInfo(AccumuloOutputFormat.java:92)
>>         at
>> com.paypal.rtgraph.demo.MapReduceWriter$.main(MapReduceWriter.scala:44)
>>         at
>> com.paypal.rtgraph.demo.MapReduceWriter.main(MapReduceWriter.scala)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>         at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>         at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>         at java.lang.reflect.Method.invoke(Method.java:606)
>>         at
>> org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:292)
>>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
>>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>
>> Does anyone know what's wrong with my code or settings? I've added all the
>> jars to Spark's classpath.
>>
>> And here's my spark-submit cmd:
>>
>> spark-submit --class com.paypal.rtgraph.demo.MapReduceWriter --master
>> yarn-client --jars `find lib -type f | tr '\n' ','` --driver-memory 2G
>> --executor-memory 20G --executor-cores 8 --num-executors 2 --verbose
>> rtgraph.jar
>>
>> If anyone has Accumulo + Spark examples, it would be great if you could share
>> them. :)
>>
>>
>> Cheers,
>>
>> --
>> Jianshi Huang
>>
>> LinkedIn: jianshi
>> Twitter: @jshuang
>> Github & Blog: http://huangjs.github.com/
>>
>
>
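
Since the thread asked for Accumulo + Spark examples, here is a rough, untested
sketch of the write path in Scala. The instance name, ZooKeeper hosts,
credentials, table name and object name below are placeholders (this is not the
actual MapReduceWriter code), and it assumes the Accumulo 1.5-era mapreduce API
and Spark 1.x:

import org.apache.accumulo.core.client.mapreduce.AccumuloOutputFormat
import org.apache.accumulo.core.client.security.tokens.PasswordToken
import org.apache.accumulo.core.data.{Mutation, Value}
import org.apache.hadoop.io.Text
import org.apache.hadoop.mapreduce.Job
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.SparkContext._

object AccumuloWriteSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("accumulo-write-sketch"))

    // Placeholder connection settings; replace with real instance/ZooKeeper/credentials.
    val job = Job.getInstance(sc.hadoopConfiguration)
    AccumuloOutputFormat.setConnectorInfo(job, "user", new PasswordToken("secret"))
    AccumuloOutputFormat.setZooKeeperInstance(job, "instance", "zk1:2181,zk2:2181")
    AccumuloOutputFormat.setDefaultTableName(job, "rtgraph")
    AccumuloOutputFormat.setCreateTables(job, true)

    // AccumuloOutputFormat expects (Text tableName, Mutation) pairs.
    val mutations = sc.parallelize(Seq("row1", "row2")).map { row =>
      val m = new Mutation(new Text(row))
      m.put(new Text("cf"), new Text("cq"), new Value("v".getBytes))
      (new Text("rtgraph"), m)
    }

    // The output path is not used by AccumuloOutputFormat, so any string will do.
    mutations.saveAsNewAPIHadoopFile(
      "/tmp/unused",
      classOf[Text],
      classOf[Mutation],
      classOf[AccumuloOutputFormat],
      job.getConfiguration)

    sc.stop()
  }
}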


-- 
Jianshi Huang

LinkedIn: jianshi
Twitter: @jshuang
Github & Blog: http://huangjs.github.com/
