spark-reviews mailing list archives

From viirya <...@git.apache.org>
Subject [GitHub] spark pull request #19184: [SPARK-21971][CORE] Too many open files in Spark ...
Date Mon, 11 Sep 2017 05:22:26 GMT
Github user viirya commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19184#discussion_r137981905
  
    --- Diff: core/src/main/java/org/apache/spark/util/collection/unsafe/sort/UnsafeSorterSpillReader.java ---
    @@ -104,6 +124,10 @@ public void loadNext() throws IOException {
         if (taskContext != null) {
           taskContext.killTaskIfInterrupted();
         }
    +    if (this.din == null) {
    +      // Good time to init (if all files are opened, we can get Too Many files exception)
    --- End diff ---
    
    This comment looks confusing. Maybe: `It is the time to initialize and hold the input stream of the spill file for loading records. Keeping the input stream open too early will very likely run into the too-many-open-files issue.`
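    
    For context, the pattern being reviewed defers opening the spill file's stream until the first `loadNext()` call, so that creating many spill readers (e.g. for a large merge) does not hold one file descriptor per spill file the whole time. A minimal sketch of that lazy-open pattern follows; only `din` and `loadNext()` come from the diff, and the class, field, and constructor names here are hypothetical:
    
    ```java
    import java.io.BufferedInputStream;
    import java.io.DataInputStream;
    import java.io.File;
    import java.io.FileInputStream;
    import java.io.IOException;
    
    // Hypothetical reader illustrating the lazy-open pattern from the diff:
    // the file handle is acquired on first use, not at construction time.
    class LazySpillReader {
      private final File file;      // spill file on disk
      private DataInputStream din;  // stays null until the first record is read
    
      LazySpillReader(File file) {
        this.file = file;           // note: no stream is opened here
      }
    
      public void loadNext() throws IOException {
        if (din == null) {
          // Open the stream only now. Opening it eagerly in the constructor
          // would keep one descriptor alive per spill file and can exceed the
          // OS open-file limit when many readers exist at once.
          din = new DataInputStream(
              new BufferedInputStream(new FileInputStream(file)));
        }
        // ... read the next record from din ...
      }
    
      public void close() throws IOException {
        if (din != null) {
          din.close();
          din = null;
        }
      }
    }
    ```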


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org

