spark-issues mailing list archives

From "Virgil Palanciuc (JIRA)" <>
Subject [jira] [Commented] (SPARK-19552) Upgrade Netty version to 4.1.8 final
Date Tue, 14 Mar 2017 08:24:41 GMT


Virgil Palanciuc commented on SPARK-19552:

If this is going to be released in Spark 2.1.1, please make sure you upgrade to 4.1.9 final.

I've hit an issue where Spark would simply take forever to run. Initially I suspected a skewed
join, but after some more investigation I noticed it was stuck in {{io.netty.util.Recycler$Stack.scavengeSome}},
which led me to this bug:
Apparently it's fixed in Netty 4.0.43, but Spark 2.1.0 uses Netty 4.0.42... (the fix was
cherry-picked into the Netty 4.1 line and has been available since 4.1.9).
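A minimal sketch (my addition, not from the original report) of one way to confirm the same hang from inside the JVM: scan the live thread stacks for the {{io.netty.util.Recycler$Stack.scavengeSome}} frame, i.e. the same frame a jstack thread dump would show.

```java
import java.util.Map;

// Diagnostic sketch: report whether any live thread is currently inside a
// given method (e.g. io.netty.util.Recycler$Stack.scavengeSome).
public class RecyclerHangCheck {

    static boolean anyThreadIn(String className, String methodName) {
        for (Map.Entry<Thread, StackTraceElement[]> entry
                : Thread.getAllStackTraces().entrySet()) {
            for (StackTraceElement frame : entry.getValue()) {
                if (frame.getClassName().equals(className)
                        && frame.getMethodName().equals(methodName)) {
                    return true;  // some thread is currently inside this method
                }
            }
        }
        return false;
    }

    public static void main(String[] args) {
        // On an affected Spark 2.1.0 executor this would print true while the job hangs;
        // in a plain JVM with no Netty threads it prints false.
        System.out.println(
                anyThreadIn("io.netty.util.Recycler$Stack", "scavengeSome"));
    }
}
```

A repeated true from this check (or the same frame recurring across jstack dumps) distinguishes the Recycler spin from an ordinarily slow skewed join.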

> Upgrade Netty version to 4.1.8 final
> ------------------------------------
>                 Key: SPARK-19552
>                 URL:
>             Project: Spark
>          Issue Type: Improvement
>          Components: Build
>    Affects Versions: 2.1.0
>            Reporter: Adam Roberts
>            Priority: Minor
> Netty 4.1.8 was recently released but isn't API compatible with previous major versions
(like Netty 4.0.x), see for
> This version does include a fix for a security concern, but not one we'd be exposed to
with Spark "out of the box". Let's upgrade the version we use to be on the safe side, as the
security fix I'm especially interested in is not available in the 4.0.x release line.
> We should move up anyway to take on a number of other bug fixes cited in the release notes
(and if anyone were to use Spark with Netty and tcnative, they shouldn't be exposed to the
security problem) - we should be good citizens and make this change.
> As this 4.1 version involves API changes, we'll need to implement a few methods and possibly
adjust the SASL tests. This JIRA and the associated pull request start the process, which I'll
work on - and any help would be much appreciated! Currently I know:
> {code}
> @Override
> public void write(ChannelHandlerContext ctx, Object msg, ChannelPromise promise)
>     throws Exception {
>   if (!foundEncryptionHandler) {
>     foundEncryptionHandler = != null; // <-- this returns false and causes test failures
>   }
>   ctx.write(msg, promise);
> }
> {code}
> Here's what changes will be required (at least):
> {code}common/network-common/src/main/java/org/apache/spark/network/crypto/{code} requires touch, retain and transferred methods
> {code}common/network-common/src/main/java/org/apache/spark/network/sasl/{code} requires the above methods too
> {code}common/network-common/src/test/java/org/apache/spark/network/protocol/{code}
> With "dummy" implementations so we can at least compile and test, we'll see five new
test failures to address.
> These are
> {code}
> org.apache.spark.rpc.netty.NettyRpcEnvSuite.send with SASL encryption
> org.apache.spark.rpc.netty.NettyRpcEnvSuite.ask with SASL encryption
> {code}
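For context on the "touch, retain and transferred methods" above: in Netty 4.1, {{ReferenceCounted}} gained {{touch()}}/{{touch(Object hint)}}, and {{FileRegion.transfered()}} was renamed to {{transferred()}}. A self-contained sketch of the stub shapes involved, using local stand-in interfaces (hypothetical, since Netty itself isn't assumed on the classpath here):

```java
// Stand-ins for the relevant slices of Netty 4.1's interfaces
// (illustration only; the real ones live in io.netty.util / io.netty.channel).
interface ReferenceCounted {
    ReferenceCounted retain();
    ReferenceCounted touch(Object hint);  // new in Netty 4.1 (leak-tracking hint)
    boolean release();
}

interface FileRegion extends ReferenceCounted {
    long transferred();                   // renamed from transfered() in 4.1
}

// The kind of stubs a message wrapper would need just to compile against 4.1;
// real implementations would delegate to the wrapped buffer or region.
class StubRegion implements FileRegion {
    private int refCnt = 1;
    private long bytesTransferred = 0;

    @Override public StubRegion retain() { refCnt++; return this; }
    @Override public StubRegion touch(Object hint) { return this; }  // no-op hint
    @Override public boolean release() { return --refCnt == 0; }
    @Override public long transferred() { return bytesTransferred; }
}

public class NettyApiStubs {
    public static void main(String[] args) {
        StubRegion r = new StubRegion();
        r.retain();
        System.out.println(r.release());  // false: refCnt 2 -> 1
        System.out.println(r.release());  // true:  refCnt 1 -> 0
    }
}
```

Even as no-ops, {{touch}} should return {{this}} so reference-counting call chains keep working; the real bodies then only need to forward to the underlying object.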

This message was sent by Atlassian JIRA
