accumulo-notifications mailing list archives

From "Christopher Tubbs (JIRA)" <>
Subject [jira] [Commented] (ACCUMULO-1244) commons-io version conflict with CDH4
Date Mon, 22 Apr 2013 15:17:15 GMT


Christopher Tubbs commented on ACCUMULO-1244:

{quote}Is it a necessity for the dependencies to be marked provided in the module poms, or
could we move the provided markings to the top-level pom?{quote}

I would argue that scopes are best put in module poms, because the same artifact may have
two different scopes in two different modules in a multi-module project, and it gets confusing
when the default "compile" scope is explicit in some cases (to override the parent) and implicit
in other cases.
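A minimal sketch of the pattern being argued for (artifact choice and module roles are illustrative, not taken from the actual Accumulo poms): the parent pins the version once in dependencyManagement, and each module pom declares only the scope it needs.

```xml
<!-- Parent pom: pin the version once, no scope -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client</artifactId>
      <version>2.0.0-cdh4.2.0</version>
    </dependency>
  </dependencies>
</dependencyManagement>

<!-- Module A (e.g. a server component that runs on a Hadoop-provided classpath) -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <scope>provided</scope>
</dependency>

<!-- Module B (e.g. a standalone tool that must ship the jars) -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <scope>compile</scope>
</dependency>
```

With the scope left out of the parent, neither module has to override an inherited scope, so an explicit "compile" only ever appears where it is actually meant.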

Specifying scopes this way also helps simplify the copy-dependencies plugin configuration
a lot, and makes the behavior of plugins that use the scopes more predictable (vs. the extra
configuration in our assembly to get around the messiness of picking and choosing what to
include). It also makes our integration testing environment much more realistic. The biggest
downside I'm aware of is that provided scope is not resolved transitively in the unit testing
phase of the build lifecycle.
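For example, once scopes live in the module poms, the copy-dependencies configuration can reduce to a scope filter instead of hand-picked include/exclude lists (a sketch; the execution id, phase, and output directory are illustrative):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>copy-runtime-deps</id>
      <phase>package</phase>
      <goals>
        <goal>copy-dependencies</goal>
      </goals>
      <configuration>
        <!-- "runtime" covers compile+runtime scopes; provided deps are skipped -->
        <includeScope>runtime</includeScope>
        <outputDirectory>${project.build.directory}/lib</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
```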

{quote}When I try to init, it throws an exception because it can't find log4j.{quote}
I don't get that. Have you figured this out yet?
> commons-io version conflict with CDH4
> -------------------------------------
>                 Key: ACCUMULO-1244
>                 URL:
>             Project: Accumulo
>          Issue Type: Bug
>         Environment: Hadoop version 2.0.0-CDH4.2.0
>            Reporter: Adam Fuchs
>            Assignee: Christopher Tubbs
>             Fix For: 1.5.0
> CDH4 appears to rely on commons-io version 2.0 or greater. Accumulo currently bundles
> version 1.4. We should bump this up to achieve compatibility.
> Workaround: put the hadoop dependency libraries before the accumulo dependency libraries
> in the general.classpaths variable in accumulo-site.xml.
> {code}
> 2013-04-04 22:27:13,868 [tabletserver.Tablet] ERROR: Unknown error during minor compaction
> for extent: !0;~;!0<
> java.lang.RuntimeException: java.lang.NoSuchMethodError:;)V
>   at org.apache.accumulo.server.tabletserver.Tablet.minorCompact(
>   at org.apache.accumulo.server.tabletserver.Tablet.access$4400(
>   at org.apache.accumulo.server.tabletserver.Tablet$
>   at
>   at
>   at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(
>   at java.util.concurrent.ThreadPoolExecutor$
>   at
>   at
>   at
> Caused by: java.lang.NoSuchMethodError:;)V
>   at org.apache.hadoop.hdfs.DFSInputStream.getBlockReader(
>   at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(
>   at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(
>   at
>   at
>   at
>   at
>   at org.apache.accumulo.core.file.rfile.bcfile.BCFile$Reader.<init>(
>   at org.apache.accumulo.core.file.blockfile.impl.CachableBlockFile$Reader.init(
>   at org.apache.accumulo.core.file.blockfile.impl.CachableBlockFile$Reader.getBCFile(
>   at org.apache.accumulo.core.file.blockfile.impl.CachableBlockFile$Reader.access$000(
>   at org.apache.accumulo.core.file.blockfile.impl.CachableBlockFile$Reader$MetaBlockLoader.get(
>   at org.apache.accumulo.core.file.blockfile.impl.CachableBlockFile$Reader.getBlock(
>   at org.apache.accumulo.core.file.blockfile.impl.CachableBlockFile$Reader.getMetaBlock(
>   at org.apache.accumulo.core.file.blockfile.impl.CachableBlockFile$Reader.getMetaBlock(
>   at org.apache.accumulo.core.file.rfile.RFile$Reader.<init>(
>   at org.apache.accumulo.core.file.rfile.RFileOperations.openReader(
>   at org.apache.accumulo.core.file.DispatchingFileFactory.openReader(
>   at
>   at
>   at org.apache.accumulo.server.tabletserver.Tablet.minorCompact(
>   ... 9 more
> {code}
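The workaround quoted above amounts to reordering general.classpaths so Hadoop's lib directories (and their commons-io 2.x) are searched before Accumulo's bundled jars. A sketch of what that could look like in accumulo-site.xml; the exact path patterns depend on the local CDH4 layout and are illustrative only:

```xml
<!-- accumulo-site.xml (paths illustrative): Hadoop entries listed
     before Accumulo's lib/, so Hadoop's commons-io 2.x is found first -->
<property>
  <name>general.classpaths</name>
  <value>
    $HADOOP_PREFIX/share/hadoop/common/.*.jar,
    $HADOOP_PREFIX/share/hadoop/common/lib/.*.jar,
    $ACCUMULO_HOME/lib/[^.].*.jar,
  </value>
</property>
```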

This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators.
