accumulo-notifications mailing list archives

From "Josh Elser (JIRA)" <>
Subject [jira] [Commented] (ACCUMULO-3100) Accumulo fails to test against recent Hadoop 2.6.0-SNAPSHOT
Date Fri, 05 Sep 2014 22:08:32 GMT


Josh Elser commented on ACCUMULO-3100:

bq. Doing this right will require ACCUMULO-1483, right?

That's half of it. The other half would be making sure that user iterators on the server are
also happy.

bq. if that doesn't happen in time we document that Accumulo 1.6 only runs on up to Hadoop

I'd be -1 for this. Like you said, we don't make guarantees on what dependencies we provide
from version to version. I still think the best immediate solution is to downgrade to Guava
14.0.1 and work toward getting Hadoop onto a more recent Guava version on a quicker timeframe,
as well as toward a better user/server-space isolation story. If users want to substitute
their own version of Guava, they can still do so, but it's tagged as YMMV. We advertise
dependencies that actually work across the breadth of (stable) Hadoop-2, and users who want
newer versions of dependencies can provide them on their own.
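The "substitute your own Guava, YMMV" point boils down to whether a given class is loadable at runtime: com.google.common.io.LimitInputStream ships in Guava through 14.0.1 and is gone in 15.0. A minimal sketch of such a probe (the class and helper names here are mine for illustration, not anything in Accumulo):

```java
// Sketch: probe the classpath for the Guava class that Hadoop's
// JobSubmitter needs. LimitInputStream exists through Guava 14.0.1
// and was removed in 15.0, which is what the NoClassDefFoundError
// in the stack trace below boils down to.
public class GuavaProbe {

    // Hypothetical helper: true if the named class can be loaded
    // by the current classloader, false otherwise.
    static boolean isClassPresent(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        String cls = "";
        if (isClassPresent(cls)) {
            System.out.println(cls + " found: Guava <= 14.0.1 is on the classpath");
        } else {
            System.out.println(cls + " missing: Guava >= 15.0 (or no Guava) is on the classpath");
        }
    }
}
```

A check like this run inside the user's job would tell them up front whether their substituted Guava will survive Hadoop's job submission path.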

> Accumulo fails to test against recent Hadoop 2.6.0-SNAPSHOT
> -----------------------------------------------------------
>                 Key: ACCUMULO-3100
>                 URL:
>             Project: Accumulo
>          Issue Type: Bug
>          Components: test
>    Affects Versions: 1.6.0
>            Reporter: Josh Elser
>            Assignee: Josh Elser
>             Fix For: 1.6.1, 1.7.0
> JobSubmitter makes a call out to CryptoUtils to test for encrypted shuffle support that
> was recently added to branch-2 (specifically HDFS-6134 and HADOOP-10150, looking at the blame)
> {noformat}
> java.lang.NoClassDefFoundError: com/google/common/io/LimitInputStream
> 	at$
> 	at$
> 	at Method)
> 	at
> 	at java.lang.ClassLoader.loadClass(
> 	at sun.misc.Launcher$AppClassLoader.loadClass(
> 	at java.lang.ClassLoader.loadClass(
> 	at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(
> 	at org.apache.hadoop.mapreduce.Job$
> 	at org.apache.hadoop.mapreduce.Job$
> 	at Method)
> 	at
> 	at
> 	at org.apache.hadoop.mapreduce.Job.submit(
> 	at org.apache.hadoop.mapreduce.Job.waitForCompletion(
> {noformat}
> Because of this, we can't run the test: job submission fails to load LimitInputStream,
> since we depend on Guava 15.0, which no longer contains that class.
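The downgrade proposed in the comment above would amount to pinning the Guava dependency in the build. A sketch of the idea as a Maven POM fragment (illustrative only, not the actual Accumulo patch):

```xml
<!-- Sketch only: pin Guava to the last release that still ships
     com.google.common.io.LimitInputStream, so Hadoop's JobSubmitter
     can resolve it at runtime. -->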

This message was sent by Atlassian JIRA
