spark-reviews mailing list archives

From ash211 <...@git.apache.org>
Subject [GitHub] spark pull request: [SPARK-3408] Fixed Limit operator so it works ...
Date Fri, 05 Sep 2014 08:20:31 GMT
Github user ash211 commented on the pull request:

    https://github.com/apache/spark/pull/2281#issuecomment-54597321
  
    I don't see that contract documented in the Scaladoc for the method:
    
    ```
      /**
       * Return a new RDD by applying a function to each partition of this RDD.
       *
       * `preservesPartitioning` indicates whether the input function preserves the partitioner, which
       * should be `false` unless this is a pair RDD and the input function doesn't modify the keys.
       */
      def mapPartitions[U: ClassTag](
          f: Iterator[T] => Iterator[U], preservesPartitioning: Boolean = false): RDD[U] = {
        val func = (context: TaskContext, index: Int, iter: Iterator[T]) => f(iter)
        new MapPartitionsRDD(this, sc.clean(func), preservesPartitioning)
      }
    ```
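
    For context, a minimal sketch of how I read that contract (illustrative only; it assumes an existing SparkContext `sc` and a small hypothetical pair RDD): the function below only rewrites values and leaves the keys alone, so passing `preservesPartitioning = true` keeps the parent's partitioner on the result.

    ```
    import org.apache.spark.HashPartitioner

    // A pair RDD explicitly partitioned by key.
    val pairs = sc.parallelize(Seq(("a", 1), ("b", 2), ("c", 3)))
      .partitionBy(new HashPartitioner(4))

    // The function only touches values, so the keys (and hence the partitioner)
    // remain valid and preservesPartitioning = true is safe here.
    val doubled = pairs.mapPartitions(
      iter => iter.map { case (k, v) => (k, v * 2) },
      preservesPartitioning = true)

    assert(doubled.partitioner == pairs.partitioner)
    ```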
    
    Should I send a PR documenting it?



