spark-issues mailing list archives

From "Apache Spark (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-22770) When the driver is stopping, there is a problem: org.apache.spark.SparkException: Could not find CoarseGrainedScheduler
Date Wed, 13 Dec 2017 13:22:00 GMT

    [ https://issues.apache.org/jira/browse/SPARK-22770?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16289241#comment-16289241 ]

Apache Spark commented on SPARK-22770:
--------------------------------------

User 'KaiXinXiaoLei' has created a pull request for this issue:
https://github.com/apache/spark/pull/19967

> When the driver is stopping, there is a problem: org.apache.spark.SparkException: Could not find CoarseGrainedScheduler
> --------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-22770
>                 URL: https://issues.apache.org/jira/browse/SPARK-22770
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.2.1
>            Reporter: KaiXinXIaoLei
>
> I run "spark-sql --master yarn --num-executors 1000 -f createTable.sql". When the task is finished, there is an error: org.apache.spark.SparkException: Could not find CoarseGrainedScheduler. I think the log level should be warning, not error.
> {noformat}
> 17/12/12 18:30:16 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
> 17/12/12 18:30:16 ERROR TransportRequestHandler: Error while invoking RpcHandler#receive() for one-way message.
> org.apache.spark.SparkException: Could not find CoarseGrainedScheduler.
>         at org.apache.spark.rpc.netty.Dispatcher.postMessage(Dispatcher.scala:154)
>         at org.apache.spark.rpc.netty.Dispatcher.postOneWayMessage(Dispatcher.scala:134)
>         at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:570)
>         at org.apache.spark.network.server.TransportRequestHandler.processOneWayMessage(TransportRequestHandler.java:180)
>         at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:109)
>         at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:119)
>         at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:51)
>         at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
>         at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
>         at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
>         at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
>         at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266)
>         at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
>         at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
>         at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
>         at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
>         at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
>         at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
>         at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
>         at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
> {noformat}
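
For reference, below is a minimal, self-contained sketch of the behaviour being requested, assuming the missing-endpoint race only needs WARN-level logging while the driver is shutting down. This is NOT Spark's actual Dispatcher code (see the linked pull request for the real change); the class and method names here are simplified stand-ins for the ones in the stack trace.

{noformat}
// Sketch only: a toy dispatcher that drops one-way messages with a warning
// once it has been stopped, instead of throwing and producing an ERROR log.
import java.util.concurrent.ConcurrentHashMap

class SparkException(msg: String) extends Exception(msg)

class Dispatcher {
  private val endpoints = new ConcurrentHashMap[String, String => Unit]()
  @volatile private var stopped = false

  def register(name: String, handler: String => Unit): Unit = {
    endpoints.put(name, handler)
  }

  def unregisterAll(): Unit = { stopped = true; endpoints.clear() }

  def postOneWayMessage(endpointName: String, message: String): Unit = {
    val handler = endpoints.get(endpointName)
    if (handler != null) {
      handler(message)
    } else if (stopped) {
      // Expected race while the driver is shutting down: keep it at WARN.
      println(s"WARN: dropping '$message' because endpoint $endpointName is stopped")
    } else {
      // Outside of shutdown a missing endpoint is still a real problem.
      throw new SparkException(s"Could not find $endpointName.")
    }
  }
}

object Demo extends App {
  val dispatcher = new Dispatcher
  dispatcher.register("CoarseGrainedScheduler", msg => println(s"handled: $msg"))
  dispatcher.postOneWayMessage("CoarseGrainedScheduler", "LaunchTask")
  dispatcher.unregisterAll()
  // Before: an ERROR with a full stack trace as in the log above; after: only a warning.
  dispatcher.postOneWayMessage("CoarseGrainedScheduler", "StatusUpdate")
}
{noformat}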



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

