pig-dev mailing list archives

From "Adam Szita (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (PIG-5297) Yarn-client mode doesn't work with Spark 2
Date Fri, 01 Sep 2017 13:23:00 GMT

     [ https://issues.apache.org/jira/browse/PIG-5297?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Adam Szita updated PIG-5297:
----------------------------
    Description: 
When running tests that were built with Spark 2 in yarn-client mode, I get the following exception:
{code}
Caused by: java.lang.IllegalStateException: Library directory '...../pig/assembly/target/scala-2.11/jars' does not exist; make sure Spark is built.
	at org.apache.spark.launcher.CommandBuilderUtils.checkState(CommandBuilderUtils.java:248)
	at org.apache.spark.launcher.CommandBuilderUtils.findJarsDir(CommandBuilderUtils.java:368)
	at org.apache.spark.launcher.YarnCommandBuilderUtils$.findJarsDir(YarnCommandBuilderUtils.scala:38)
	at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:558)
	at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:882)
{code}
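One way the symlink/SPARK_HOME workaround described below might look; this is a sketch only, and /opt/spark-2.1.0 and $HOME/pig are placeholder paths, not the truncated path from the trace:

```shell
# Workaround sketch (hypothetical paths -- adjust to your actual Spark 2
# install and Pig checkout). The launcher complained about a missing
# '.../assembly/target/scala-2.11/jars' directory, so create that directory
# and symlink it to a real Spark 2 jars directory, then export SPARK_HOME.
SPARK_HOME="${SPARK_HOME:-/opt/spark-2.1.0}"
PIG_HOME="${PIG_HOME:-$HOME/pig}"
mkdir -p "$PIG_HOME/assembly/target/scala-2.11"
ln -sfn "$SPARK_HOME/jars" "$PIG_HOME/assembly/target/scala-2.11/jars"
export SPARK_HOME
```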

After overcoming this with symlinks and setting SPARK_HOME, I hit another issue:

{code}
Caused by: java.lang.NoSuchMethodError: io.netty.channel.DefaultFileRegion.<init>(Ljava/io/File;JJ)V
	at org.apache.spark.network.buffer.FileSegmentManagedBuffer.convertToNetty(FileSegmentManagedBuffer.java:133)
	at org.apache.spark.network.protocol.MessageEncoder.encode(MessageEncoder.java:58)
	at org.apache.spark.network.protocol.MessageEncoder.encode(MessageEncoder.java:33)
	at io.netty.handler.codec.MessageToMessageEncoder.write(MessageToMessageEncoder.java:89)
{code}

I believe this is an incompatibility between the netty-all versions required by Hadoop and Spark.
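To check that suspicion, a small shell helper (hypothetical, not part of Pig) can list the netty-all versions each install ships; /opt/hadoop and /opt/spark below are placeholder paths:

```shell
# list_netty_versions DIR...: print the version of every netty-all jar found
# under the given directories. Seeing two different versions across the
# Hadoop and Spark installs would explain the NoSuchMethodError above.
list_netty_versions() {
  find "$@" -name 'netty-all-*.jar' 2>/dev/null \
    | sed 's|.*/netty-all-\(.*\)\.jar$|\1|' | sort -u
}

# Placeholder install paths -- substitute your actual HADOOP_HOME/SPARK_HOME.
list_netty_versions "${HADOOP_HOME:-/opt/hadoop}" "${SPARK_HOME:-/opt/spark}"
```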

  was:
When running tests that were built with Spark 2 in yarn-client mode, I get the following exception:
{code}
Caused by: java.lang.IllegalStateException: Library directory '...../pig/assembly/target/scala-2.11/jars' does not exist; make sure Spark is built.
	at org.apache.spark.launcher.CommandBuilderUtils.checkState(CommandBuilderUtils.java:248)
	at org.apache.spark.launcher.CommandBuilderUtils.findJarsDir(CommandBuilderUtils.java:368)
	at org.apache.spark.launcher.YarnCommandBuilderUtils$.findJarsDir(YarnCommandBuilderUtils.scala:38)
	at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:558)
	at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:882)
{code}

After overcoming this with symlinks and setting SPARK_MASTER, I hit another issue:

{code}
Caused by: java.lang.NoSuchMethodError: io.netty.channel.DefaultFileRegion.<init>(Ljava/io/File;JJ)V
	at org.apache.spark.network.buffer.FileSegmentManagedBuffer.convertToNetty(FileSegmentManagedBuffer.java:133)
	at org.apache.spark.network.protocol.MessageEncoder.encode(MessageEncoder.java:58)
	at org.apache.spark.network.protocol.MessageEncoder.encode(MessageEncoder.java:33)
	at io.netty.handler.codec.MessageToMessageEncoder.write(MessageToMessageEncoder.java:89)
{code}

I believe this is an incompatibility between the netty-all versions required by Hadoop and Spark.


> Yarn-client mode doesn't work with Spark 2
> ------------------------------------------
>
>                 Key: PIG-5297
>                 URL: https://issues.apache.org/jira/browse/PIG-5297
>             Project: Pig
>          Issue Type: Sub-task
>          Components: spark
>            Reporter: Adam Szita
>            Assignee: Adam Szita
>
> When running tests that were built with Spark 2 in yarn-client mode, I get the following exception:
> {code}
> Caused by: java.lang.IllegalStateException: Library directory '...../pig/assembly/target/scala-2.11/jars' does not exist; make sure Spark is built.
> 	at org.apache.spark.launcher.CommandBuilderUtils.checkState(CommandBuilderUtils.java:248)
> 	at org.apache.spark.launcher.CommandBuilderUtils.findJarsDir(CommandBuilderUtils.java:368)
> 	at org.apache.spark.launcher.YarnCommandBuilderUtils$.findJarsDir(YarnCommandBuilderUtils.scala:38)
> 	at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:558)
> 	at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:882)
> {code}
> After overcoming this with symlinks and setting SPARK_HOME, I hit another issue:
> {code}
> Caused by: java.lang.NoSuchMethodError: io.netty.channel.DefaultFileRegion.<init>(Ljava/io/File;JJ)V
> 	at org.apache.spark.network.buffer.FileSegmentManagedBuffer.convertToNetty(FileSegmentManagedBuffer.java:133)
> 	at org.apache.spark.network.protocol.MessageEncoder.encode(MessageEncoder.java:58)
> 	at org.apache.spark.network.protocol.MessageEncoder.encode(MessageEncoder.java:33)
> 	at io.netty.handler.codec.MessageToMessageEncoder.write(MessageToMessageEncoder.java:89)
> {code}
> I believe this is an incompatibility between the netty-all versions required by Hadoop and Spark.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
