crunch-dev mailing list archives

From "Gabriel Reid (JIRA)" <>
Subject [jira] [Commented] (CRUNCH-518) Can't build crunch-spark due to protobuf dependency
Date Fri, 08 May 2015 21:20:02 GMT


Gabriel Reid commented on CRUNCH-518:

Ah right, I even forgot that hadoop1 is the default build profile. I actually encountered
this while looking at CRUNCH-509, so I'm thinking it's not totally fixed for hadoop1 in that context.

Speaking of hadoop1, what's your view on when we can drop that? And any thoughts on at least
making hadoop-2 the default build profile?
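For reference, a minimal sketch of what flipping the default might look like in the parent pom.xml, using Maven's `activeByDefault` activation. The profile ids, property names, and versions here are assumptions for illustration, not Crunch's actual build configuration:

```xml
<!-- Hypothetical sketch: making hadoop-2 the default profile in the
     parent pom.xml. Ids and versions are illustrative assumptions. -->
<profiles>
  <profile>
    <id>hadoop-2</id>
    <activation>
      <!-- Used when no profile is selected explicitly on the command line -->
      <activeByDefault>true</activeByDefault>
    </activation>
    <properties>
      <hadoop.version>2.2.0</hadoop.version>
    </properties>
  </profile>
  <profile>
    <id>hadoop-1</id>
    <properties>
      <hadoop.version>1.1.2</hadoop.version>
    </properties>
  </profile>
</profiles>
```

Note that `activeByDefault` is deactivated as soon as any other profile is activated explicitly (e.g. `mvn -Phadoop-1 install`), so the hadoop1 build would still work unchanged.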

> Can't build crunch-spark due to protobuf dependency
> ---------------------------------------------------
>                 Key: CRUNCH-518
>                 URL:
>             Project: Crunch
>          Issue Type: Bug
>            Reporter: Gabriel Reid
>         Attachments: CRUNCH-518.patch
> When trying to do a clean build, I get errors like the following from crunch-spark:
> {code}
> [ERROR] /Users/greid/development/apache-projects/crunch/crunch-spark/src/it/scala/org/apache/crunch/scrunch/spark/PageRankTest.scala:49: error: bad symbolic reference. A signature in PTypeH.class refers to term protobuf
> [ERROR] in package which is not available.
> [ERROR] It may be completely missing from the current classpath, or the version on
> [ERROR] the classpath might be incompatible with the version used when compiling PTypeH.class.
> [ERROR]     prev.cogroup(out).map((url, v) => {
> {code}
> It seems that this is due to protobuf being referenced from PTypeH, but since it's in provided scope in crunch-scrunch, it doesn't get pulled in when building crunch-spark.
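Since provided-scope dependencies are not transitive in Maven, one way this kind of issue is typically worked around is by declaring the dependency directly in the downstream module. A sketch of what that might look like in crunch-spark's pom.xml, assuming the standard `protobuf-java` artifact is the missing dependency (the version shown is illustrative, not taken from Crunch's build):

```xml
<!-- Hypothetical sketch for crunch-spark/pom.xml: declare protobuf
     explicitly, because provided-scope dependencies declared in
     crunch-scrunch do not propagate transitively to crunch-spark. -->
<dependency>
  <groupId>com.google.protobuf</groupId>
  <artifactId>protobuf-java</artifactId>
  <version>2.5.0</version> <!-- illustrative version, an assumption -->
  <scope>provided</scope>
</dependency>
```

Keeping the scope as `provided` would make the class available at compile time without bundling it into the crunch-spark artifact.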

This message was sent by Atlassian JIRA
