spark-reviews mailing list archives

From GitBox <...@apache.org>
Subject [GitHub] [spark] turboFei commented on a change in pull request #25795: [SPARK-29037][Core] Spark gives duplicate result when an application was killed
Date Tue, 17 Sep 2019 07:31:44 GMT
turboFei commented on a change in pull request #25795: [SPARK-29037][Core] Spark gives duplicate result when an application was killed
URL: https://github.com/apache/spark/pull/25795#discussion_r325018791
 
 

 ##########
 File path: core/src/main/scala/org/apache/spark/internal/io/HadoopMapReduceCommitProtocol.scala
 ##########
 @@ -160,11 +160,15 @@ class HadoopMapReduceCommitProtocol(
 
     val taskAttemptContext = new TaskAttemptContextImpl(jobContext.getConfiguration, taskAttemptId)
     committer = setupCommitter(taskAttemptContext)
-    committer.setupJob(jobContext)
+    if (!dynamicPartitionOverwrite) {
 
 Review comment:
   When a job is killed, its staging dir can be cleaned up by the `abortJob` method.
   But when an application is killed, the staging dirs of its jobs are not cleaned up gracefully, so leftover task output can later produce duplicate results.
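   For reference, here is a minimal sketch of the cleanup path described above. It is an illustration only: the standalone helper, its name, and the explicit `committer`/`stagingDir` parameters are assumptions for this sketch, not a quote of `HadoopMapReduceCommitProtocol`.

   ```scala
   import org.apache.hadoop.fs.Path
   import org.apache.hadoop.mapreduce.{JobContext, JobStatus, OutputCommitter}

   // Hypothetical sketch: clean up a job's staging directory when the job aborts.
   object StagingCleanupSketch {
     def abortJobAndCleanStaging(
         committer: OutputCommitter,
         jobContext: JobContext,
         stagingDir: Path): Unit = {
       // Let the underlying Hadoop committer abort the job first.
       committer.abortJob(jobContext, JobStatus.State.FAILED)
       // Then delete the job's staging directory. This runs only when the job
       // itself is aborted; if the whole application is killed, abortJob is
       // never invoked and the staging directory is left behind.
       val fs = stagingDir.getFileSystem(jobContext.getConfiguration)
       fs.delete(stagingDir, true)
     }
   }
   ```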

