spark-issues mailing list archives

From "Sean Owen (JIRA)" <j...@apache.org>
Subject [jira] [Resolved] (SPARK-9097) Tasks are not completed but the number of executor is zero
Date Fri, 17 Jul 2015 07:31:04 GMT

     [ https://issues.apache.org/jira/browse/SPARK-9097?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-9097.
------------------------------
          Resolution: Invalid
    Target Version/s:   (was: 1.5.0)

[~KaiXinXIaoLei] this remains unclear. You say it relates to dynamic allocation, but there's
no mention of that in your comment. You say there are no executors, but that's not what your
screenshot shows. (I'm still not sure you have read the JIRA guidance, since you again set a
field you're asked not to set.)

There may or may not be an issue here, but this report is not moving toward communicating what
the issue is or what a solution would be. May I suggest you debug the issue a little further
and provide something reproducible, or a proposed code change? That would help you and everyone
else understand your situation. Until then, I do not think it is useful to create a JIRA like this.

> Tasks are not completed but the number of executor is zero
> ----------------------------------------------------------
>
>                 Key: SPARK-9097
>                 URL: https://issues.apache.org/jira/browse/SPARK-9097
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.4.0
>            Reporter: KaiXinXIaoLei
>         Attachments: number of executor is zero.png, runing tasks.png
>
>
> I set "spark.dynamicAllocation.enabled" to true and "spark.dynamicAllocation.minExecutors"
> to 0, then submitted tasks to run. The tasks are not completed, but the number of executors is zero.
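For context, the configuration the reporter describes would look roughly like the following sketch (property names per the Spark dynamic-allocation feature discussed above; note that dynamic allocation also requires the external shuffle service to be enabled, which the report does not mention):

```
# spark-defaults.conf (sketch of the reported setup)
spark.dynamicAllocation.enabled        true
spark.dynamicAllocation.minExecutors   0
# Required for dynamic allocation to release executors safely;
# whether this was set is not stated in the report.
spark.shuffle.service.enabled          true
```

With minExecutors set to 0, Spark is allowed to release every idle executor, so a UI snapshot showing zero executors is not by itself evidence of a bug unless tasks are simultaneously pending.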



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

