bigtop-user mailing list archives

From "Kelly, Jonathan" <jonat...@amazon.com>
Subject Re: Spark v1.2.1 failing under BigTop build in External Flume Sink (due to missing Netty library)
Date Fri, 06 Mar 2015 01:06:06 GMT
Yeah, I saw all of the JIRA comments and the patch.  You guys are ridiculously quick!  I'm
testing it out myself locally now.

Thanks a lot,
Jonathan Kelly
Elastic MapReduce - SDE
Port 99 (SEA35) 08.220.C2

From: 김영우 <warwithin@gmail.com>
Reply-To: "user@bigtop.apache.org" <user@bigtop.apache.org>
Date: Friday, March 6, 2015 at 3:00 AM
To: "user@bigtop.apache.org" <user@bigtop.apache.org>
Subject: Re: FW: Spark v1.2.1 failing under BigTop build in External Flume Sink (due to missing
Netty library)

Filed https://issues.apache.org/jira/browse/BIGTOP-1727 and uploaded a patch.

Thanks,
Youngwoo

On Fri, Mar 6, 2015 at 9:38 AM, 김영우 <warwithin@gmail.com> wrote:
Gotcha.

BIGTOP-1716 causes the build failure. I'm looking into this.

Sorry for the inconvenience; I'll upload a patch.

Thanks,
Youngwoo

On Fri, Mar 6, 2015 at 8:05 AM, jay vyas <jayunit100.apache@gmail.com> wrote:
Hi Jonathan!

I did indeed build and test Spark 1.2.1 in BIGTOP-1648, and during the review I
pasted the text output; it seemed to work nicely: https://issues.apache.org/jira/browse/BIGTOP-1648

Let's follow up on this at https://issues.apache.org/jira/browse/BIGTOP-1726, where we can
retest everything.  It's quite easy to retest; I'll leave some guidance there if
you want to try it out.


On Thu, Mar 5, 2015 at 5:04 PM, Kelly, Jonathan <jonathak@amazon.com> wrote:
As I said below, I don't think this could be a BigTop issue, but has
anybody from the BigTop community seen anything like this?

Thanks,
Jonathan Kelly




On 3/5/15, 1:34 PM, "Kelly, Jonathan" <jonathak@amazon.com> wrote:

>That's probably a good thing to have, so I'll add it, but unfortunately it
>did not help this issue.  It looks like the hadoop-2.4 profile only sets
>these properties, which don't seem like they would affect anything related
>to Netty:
>
>      <properties>
>        <hadoop.version>2.4.0</hadoop.version>
>        <protobuf.version>2.5.0</protobuf.version>
>        <jets3t.version>0.9.0</jets3t.version>
>        <commons.math3.version>3.1.1</commons.math3.version>
>        <avro.mapred.classifier>hadoop2</avro.mapred.classifier>
>      </properties>
>
>
>Thanks,
>Jonathan Kelly
>
>
>
>
>On 3/5/15, 1:09 PM, "Patrick Wendell" <pwendell@gmail.com> wrote:
>
>>You may need to add the -Phadoop-2.4 profile. When building our release
>>packages for Hadoop 2.4 we use the following flags:
>>
>>-Phadoop-2.4 -Phive -Phive-thriftserver -Pyarn
>>
>>- Patrick
>>
>>On Thu, Mar 5, 2015 at 12:47 PM, Kelly, Jonathan <jonathak@amazon.com>
>>wrote:
>>> I confirmed that this has nothing to do with BigTop by running the same
>>> mvn command directly in a fresh clone of the Spark repository at the
>>> v1.2.1 tag.  I got the same exact error.
>>>
>>> ~ Jonathan Kelly
>>>
>>> From: Jonathan Kelly <jonathak@amazon.com>
>>> Date: Thursday, March 5, 2015 at 10:39 AM
>>> To: "user@spark.apache.org" <user@spark.apache.org>
>>> Subject: Spark v1.2.1 failing under BigTop build in External Flume Sink
>>> (due to missing Netty library)
>>>
>>> I'm running into an issue building Spark v1.2.1 (as well as the latest
>>> in branch-1.2, v1.3.0-rc2, and the latest in branch-1.3) with BigTop
>>> (v0.9, which is not quite released yet).  The build fails in the
>>> External Flume Sink subproject with the following error:
>>>
>>> [INFO] Compiling 5 Scala sources and 3 Java sources to
>>> /workspace/workspace/bigtop.spark-rpm/build/spark/rpm/BUILD/spark-1.3.0/external/flume-sink/target/scala-2.10/classes...
>>> [WARNING] Class org.jboss.netty.channel.ChannelFactory not found -
>>> continuing with a stub.
>>> [ERROR] error while loading NettyServer, class file
>>> '/home/ec2-user/.m2/repository/org/apache/avro/avro-ipc/1.7.6/avro-ipc-1.7.6.jar(org/apache/avro/ipc/NettyServer.class)'
>>> is broken
>>> (class java.lang.NullPointerException/null)
>>> [WARNING] one warning found
>>> [ERROR] one error found
>>>
>>> It seems like what is happening is that the Netty library is missing at
>>> build time, which happens because it is explicitly excluded in the
>>> pom.xml (see
>>> https://github.com/apache/spark/blob/v1.2.1/external/flume-sink/pom.xml#L42).
>>> I attempted removing the exclusions and the explicit re-add for the test
>>> scope on lines 77-88, and that allowed the build to succeed, though I
>>> don't know if that will cause problems at runtime.  I don't have any
>>> experience with the Flume Sink, so I don't really know how to test it.
>>> (And, to be clear, I'm not necessarily trying to get the Flume Sink to
>>> work; I just want the project to build successfully, though of course
>>> I'd still want the Flume Sink to work for whomever does need it.)
>>>
>>> Does anybody have any idea what's going on here?  Here is the command
>>> BigTop is running to build Spark:
>>>
>>> mvn -Pbigtop-dist -Pyarn -Phive -Phive-thriftserver -Pkinesis-asl
>>> -Divy.home=/home/ec2-user/.ivy2 -Dsbt.ivy.home=/home/ec2-user/.ivy2
>>> -Duser.home=/home/ec2-user -Drepo.maven.org=
>>> -Dreactor.repo=file:///home/ec2-user/.m2/repository
>>> -Dhadoop.version=2.4.0-amzn-3-SNAPSHOT
>>> -Dyarn.version=2.4.0-amzn-3-SNAPSHOT
>>> -Dprotobuf.version=2.5.0 -Dscala.version=2.10.3
>>> -Dscala.binary.version=2.10
>>> -DskipTests -DrecompileMode=all install
>>>
>>> As I mentioned above, if I switch to the latest in branch-1.2, to
>>> v1.3.0-rc2, or to the latest in branch-1.3, I get the same exact error.
>>> I was not getting the error with Spark v1.1.0, though there weren't any
>>> changes to the external/flume-sink/pom.xml between v1.1.0 and v1.2.1.
>>>
>>> ~ Jonathan Kelly
>
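For reference, the exclusion in external/flume-sink/pom.xml that hides Netty at build time looks roughly like the following. This is only a sketch based on the pom linked above; which dependency actually carries the exclusion (flume-ng-sdk here) is an assumption, so check the linked file for the exact coordinates:

```xml
<dependency>
  <groupId>org.apache.flume</groupId>
  <artifactId>flume-ng-sdk</artifactId>
  <exclusions>
    <!-- Netty is excluded here, so unless it is re-added elsewhere it is
         absent from the compile classpath; that is consistent with the
         "NettyServer ... is broken" error during the flume-sink build. -->
    <exclusion>
      <groupId>io.netty</groupId>
      <artifactId>netty</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

Removing the <exclusions> block (or adding io.netty:netty back in compile rather than test scope) matches the workaround that made the build succeed above.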




--
jay vyas

