spark-issues mailing list archives

From "Apache Spark (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-14234) Executor crashes for TaskRunner thread interruption
Date Tue, 29 Mar 2016 08:30:25 GMT

    [ https://issues.apache.org/jira/browse/SPARK-14234?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15215679#comment-15215679 ]

Apache Spark commented on SPARK-14234:
--------------------------------------

User 'devaraj-kavali' has created a pull request for this issue:
https://github.com/apache/spark/pull/12031

> Executor crashes for TaskRunner thread interruption
> ---------------------------------------------------
>
>                 Key: SPARK-14234
>                 URL: https://issues.apache.org/jira/browse/SPARK-14234
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>            Reporter: Devaraj K
>
> If the TaskRunner thread is interrupted while running, whether by a task kill or for any other reason, the interrupted thread tries to update the task status as part of its exception handling and fails with the exceptions below. This happens from the statusUpdate calls in each of these catch blocks; the corresponding exception for each catch case follows.
> {code:title=Executor.scala|borderStyle=solid}
>         case _: TaskKilledException | _: InterruptedException if task.killed =>
>          ......
>         case cDE: CommitDeniedException =>
>          ......
>         case t: Throwable =>
>          ......
> {code}
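> The failure can be reproduced outside Spark: SerializableBuffer.writeObject writes through a channel obtained from java.nio.channels.Channels.newChannel, and that channel (Channels$WritableByteChannelImpl, the same class that appears in the traces below) extends AbstractInterruptibleChannel, so a write performed while the calling thread's interrupt status is set aborts with ClosedByInterruptException. A minimal, self-contained sketch of this behavior (the object name is only for illustration):
> {code:title=InterruptedWriteRepro.scala|borderStyle=solid}
> import java.io.ByteArrayOutputStream
> import java.nio.ByteBuffer
> import java.nio.channels.{Channels, ClosedByInterruptException}
>
> object InterruptedWriteRepro {
>   def main(args: Array[String]): Unit = {
>     // Channels.newChannel returns Channels$WritableByteChannelImpl, an
>     // AbstractInterruptibleChannel, like the one in the stack traces.
>     val channel = Channels.newChannel(new ByteArrayOutputStream())
>
>     // Simulate the TaskRunner state: the interrupt flag is still set
>     // when the catch block calls execBackend.statusUpdate(...).
>     Thread.currentThread().interrupt()
>
>     try {
>       channel.write(ByteBuffer.wrap(Array[Byte](1, 2, 3)))
>     } catch {
>       case e: ClosedByInterruptException =>
>         // The same exception that crashes the executor in the logs below.
>         println(s"Write failed: $e")
>     }
>   }
> }
> {code}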
> {code:xml}
> 16/03/29 17:32:33 ERROR SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[Executor task launch worker-2,5,main]
> java.lang.Error: java.nio.channels.ClosedByInterruptException
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1151)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> 	at java.lang.Thread.run(Thread.java:744)
> Caused by: java.nio.channels.ClosedByInterruptException
> 	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
> 	at java.nio.channels.Channels$WritableByteChannelImpl.write(Channels.java:460)
> 	at org.apache.spark.util.SerializableBuffer$$anonfun$writeObject$1.apply(SerializableBuffer.scala:49)
> 	at org.apache.spark.util.SerializableBuffer$$anonfun$writeObject$1.apply(SerializableBuffer.scala:47)
> 	at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1204)
> 	at org.apache.spark.util.SerializableBuffer.writeObject(SerializableBuffer.scala:47)
> 	at sun.reflect.GeneratedMethodAccessor20.invoke(Unknown Source)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:606)
> 	at java.io.ObjectStreamClass.invokeWriteObject(ObjectStreamClass.java:988)
> 	at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1495)
> 	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1431)
> 	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1177)
> 	at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1547)
> 	at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1508)
> 	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1431)
> 	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1177)
> 	at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1547)
> 	at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1508)
> 	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1431)
> 	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1177)
> 	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:347)
> 	at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:43)
> 	at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:100)
> 	at org.apache.spark.rpc.netty.NettyRpcEnv.serialize(NettyRpcEnv.scala:253)
> 	at org.apache.spark.rpc.netty.NettyRpcEnv.send(NettyRpcEnv.scala:192)
> 	at org.apache.spark.rpc.netty.NettyRpcEndpointRef.send(NettyRpcEnv.scala:513)
> 	at org.apache.spark.executor.CoarseGrainedExecutorBackend.statusUpdate(CoarseGrainedExecutorBackend.scala:135)
> 	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:322)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> 	... 2 more
> {code}
> {code:xml}
> 16/03/29 08:00:29 ERROR SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[Executor task launch worker-4,5,main]
> java.lang.Error: java.nio.channels.ClosedByInterruptException
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1151)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> 	at java.lang.Thread.run(Thread.java:745)
> Caused by: java.nio.channels.ClosedByInterruptException
> 	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
> 	at java.nio.channels.Channels$WritableByteChannelImpl.write(Channels.java:460)
> 	..................
> 	at org.apache.spark.rpc.netty.NettyRpcEnv.send(NettyRpcEnv.scala:192)
> 	at org.apache.spark.rpc.netty.NettyRpcEndpointRef.send(NettyRpcEnv.scala:513)
> 	at org.apache.spark.executor.CoarseGrainedExecutorBackend.statusUpdate(CoarseGrainedExecutorBackend.scala:135)
> 	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:326)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> 	... 2 more
> 16/03/29 08:00:29 INFO DiskBlockManager: Shutdown hook called
> {code}
> {code:xml}
> 16/03/29 17:28:56 ERROR SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[Executor task launch worker-3,5,main]
> java.lang.Error: java.nio.channels.ClosedByInterruptException
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1151)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> 	at java.lang.Thread.run(Thread.java:744)
> Caused by: java.nio.channels.ClosedByInterruptException
> 	at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
> 	at java.nio.channels.Channels$WritableByteChannelImpl.write(Channels.java:460)
> 	..................
> 	at org.apache.spark.rpc.netty.NettyRpcEndpointRef.send(NettyRpcEnv.scala:513)
> 	at org.apache.spark.executor.CoarseGrainedExecutorBackend.statusUpdate(CoarseGrainedExecutorBackend.scala:135)
> 	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:355)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> 	... 2 more
> {code}
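> One way to avoid the crash is to clear the thread's interrupt status, e.g. with Thread.interrupted(), in these catch blocks before invoking execBackend.statusUpdate, so the serialization path never sees the flag. The sketch below demonstrates the idea in isolation; it is illustrative only, not the exact change proposed in the pull request:
> {code:title=ClearInterruptSketch.scala|borderStyle=solid}
> import java.io.ByteArrayOutputStream
> import java.nio.ByteBuffer
> import java.nio.channels.Channels
>
> object ClearInterruptSketch {
>   def main(args: Array[String]): Unit = {
>     val channel = Channels.newChannel(new ByteArrayOutputStream())
>     Thread.currentThread().interrupt()
>
>     // Thread.interrupted() clears the flag and returns its previous value;
>     // doing the same in TaskRunner's catch blocks right before the
>     // statusUpdate call would keep the interruptible channel write safe.
>     val wasInterrupted = Thread.interrupted()
>
>     // The write now succeeds instead of throwing ClosedByInterruptException.
>     val written = channel.write(ByteBuffer.wrap(Array[Byte](1, 2, 3)))
>     println(s"wasInterrupted=$wasInterrupted, bytesWritten=$written")
>   }
> }
> {code}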



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

