spark-issues mailing list archives

From "Reynold Xin (JIRA)" <j...@apache.org>
Subject [jira] [Closed] (SPARK-4195) retry to fetch blocks's result when fetchfailed's reason is connection timeout
Date Fri, 07 Nov 2014 02:41:34 GMT

     [ https://issues.apache.org/jira/browse/SPARK-4195?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]

Reynold Xin closed SPARK-4195.
------------------------------
    Resolution: Duplicate
      Assignee: Aaron Davidson

> retry to fetch blocks's result when fetchfailed's reason is connection timeout
> ------------------------------------------------------------------------------
>
>                 Key: SPARK-4195
>                 URL: https://issues.apache.org/jira/browse/SPARK-4195
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>            Reporter: Lianhui Wang
>            Assignee: Aaron Davidson
>
> When there are many executors in an application (for example, 1000), connection timeouts often occur. The exception is:
> WARN nio.SendingConnection: Error finishing connection 
> java.net.ConnectException: Connection timed out
>         at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
>         at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
>         at org.apache.spark.network.nio.SendingConnection.finishConnect(Connection.scala:342)
>         at org.apache.spark.network.nio.ConnectionManager$$anon$11.run(ConnectionManager.scala:273)
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>         at java.lang.Thread.run(Thread.java:744)
> That makes the driver treat these executors as lost, even though they are in fact alive. So add a retry mechanism to reduce the probability of this problem occurring.
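The retry mechanism the reporter proposes could be sketched roughly as below. This is a minimal illustration only; `retryOnTimeout` and its parameters are hypothetical names, not Spark's actual shuffle-client API (the duplicate issue was resolved by Spark's own retry logic with the `spark.shuffle.io.maxRetries` / `spark.shuffle.io.retryWait` settings):

```scala
import scala.util.{Failure, Success, Try}

object RetrySketch {
  // Hypothetical helper: re-attempt an operation when it fails with a
  // connection timeout, instead of immediately treating the remote
  // executor as lost. Non-timeout failures still propagate.
  def retryOnTimeout[T](maxRetries: Int, waitMillis: Long)(op: => T): T = {
    var attempt = 0
    var result: Option[T] = None
    while (result.isEmpty) {
      Try(op) match {
        case Success(v) =>
          result = Some(v)
        case Failure(_: java.net.ConnectException) if attempt < maxRetries =>
          attempt += 1
          Thread.sleep(waitMillis) // back off briefly before retrying
        case Failure(e) =>
          throw e
      }
    }
    result.get
  }
}
```

A fetch that times out twice and then succeeds would complete normally instead of failing the stage, which matches the behavior the reporter asks for.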



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

