accumulo-notifications mailing list archives

From "Josh Elser (JIRA)" <>
Subject [jira] [Commented] (ACCUMULO-3100) Accumulo fails to test against recent Hadoop 2.6.0-SNAPSHOT
Date Fri, 05 Sep 2014 16:57:28 GMT


Josh Elser commented on ACCUMULO-3100:

bq. The issue with MAC was that the maven plugins were using a fork of guava, hence the ordering

No, that's a completely separate issue about some sisu plugins repackaging Guava classes.
ACCUMULO-2714, linked above, is purely because we depend on Guava 15.0 while Hadoop 2 depends on 14.0.1.

bq.  If we pull back the version, we break compatibility for all users who may be relying
on guava features introduced after whatever version hadoop is using

We're also playing with fire right now for any new version of Hadoop. Like I said, the only
reason this hasn't bitten us in production yet is that the Hadoop client APIs don't use those
classes directly. As it stands, it's extremely confusing and misleading for clients: Accumulo
1.6.0 with Guava 15.0 (which our dependencies imply should work) does not work with anything
>= Hadoop 2.4.0.

bq. our previous guava version that DID affect us

Please tell us what the actual bugs were; as I already stated, I didn't see any changes between
14.0.1 and 15.0 that were likely to affect our usage of Guava.
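Conflicts like this one (the project pinned to Guava 15.0 while Hadoop 2 pulls in 14.0.1) can at least be surfaced at build time rather than as a runtime NoClassDefFoundError. A sketch, not taken from the issue, using the maven-enforcer-plugin's dependencyConvergence rule (the plugin version shown is illustrative):

```xml
<!-- Illustrative sketch: fail the build when the dependency tree
     contains conflicting versions of the same artifact, e.g.
     Guava 15.0 via one dependency vs. 14.0.1 via Hadoop 2. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <version>1.3.1</version> <!-- illustrative version -->
  <executions>
    <execution>
      <id>enforce-dependency-convergence</id>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <dependencyConvergence/>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>
```

This only flags the divergence; resolving it still requires choosing one version (or shading), which is exactly the trade-off being debated above.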

> Accumulo fails to test against recent Hadoop 2.6.0-SNAPSHOT
> -----------------------------------------------------------
>                 Key: ACCUMULO-3100
>                 URL:
>             Project: Accumulo
>          Issue Type: Bug
>          Components: test
>    Affects Versions: 1.6.0
>            Reporter: Josh Elser
>            Assignee: Josh Elser
>             Fix For: 1.6.1, 1.7.0
> JobSubmitter makes a call out to CryptoUtils to test for encrypted-shuffle support that
was recently added to branch-2 (specifically HDFS-6134 and HADOOP-10150, looking at the blame)
> {noformat}
> java.lang.NoClassDefFoundError: com/google/common/io/LimitInputStream
> 	at$
> 	at$
> 	at Method)
> 	at
> 	at java.lang.ClassLoader.loadClass(
> 	at sun.misc.Launcher$AppClassLoader.loadClass(
> 	at java.lang.ClassLoader.loadClass(
> 	at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(
> 	at org.apache.hadoop.mapreduce.Job$
> 	at org.apache.hadoop.mapreduce.Job$
> 	at Method)
> 	at
> 	at
> 	at org.apache.hadoop.mapreduce.Job.submit(
> 	at org.apache.hadoop.mapreduce.Job.waitForCompletion(
> {noformat}
> Because we depend on Guava 15.0, which no longer contains LimitInputStream, the class
cannot be loaded and the test fails to run.
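The failure mode above can be demonstrated generically: asking the classloader for a class that is absent from the classpath fails. Hadoop hits this from already-compiled code, which surfaces as NoClassDefFoundError; a reflective lookup surfaces the same absence as ClassNotFoundException. A minimal sketch (the class and method names here are mine, and it assumes no pre-15.0 Guava is on the classpath):

```java
// Minimal sketch of the failure mode: probing whether a class is
// resolvable from the current classpath.
public class MissingClassDemo {

    /** Returns true if the named class can be loaded from the classpath. */
    static boolean isLoadable(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // LimitInputStream was removed in Guava 15.0, so with Guava 15.0
        // (or no Guava at all) on the classpath this prints "false".
        System.out.println(isLoadable(""));
        // A JDK class, by contrast, is always loadable:
        System.out.println(isLoadable(""));
    }
}
```

This is why the breakage only appears once some Hadoop code path (here, JobSubmitter's encrypted-shuffle check) actually references the removed class.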

This message was sent by Atlassian JIRA
