spark-user mailing list archives

From Philip Weaver <philip.wea...@gmail.com>
Subject Location preferences in pyspark?
Date Sat, 17 Oct 2015 00:42:03 GMT
I believe what I want is the exact functionality provided by
SparkContext.makeRDD in Scala. For each element in the RDD, I want to
specify a list of preferred hosts for processing that element.

It looks like this method exists only in Scala, and as far as I can tell
there is no equivalent in the Python API. Is that correct?
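For reference, the Scala overload I mean is SparkContext.makeRDD(seq: Seq[(T, Seq[String])]), which takes each element paired with its preferred hosts. A minimal sketch (hostnames are placeholders, and it assumes a local SparkContext just to show the call):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object MakeRddWithPrefs {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("makeRDD-prefs").setMaster("local[2]"))

    // Each element is paired with a list of preferred hosts; the scheduler
    // tries to place that element's task on one of those hosts.
    val data = Seq(
      (1, Seq("host-a", "host-b")),  // "host-a" etc. are hypothetical hosts
      (2, Seq("host-c"))
    )
    val rdd = sc.makeRDD(data)
    rdd.collect().foreach(println)
    sc.stop()
  }
}
```

I don't see any way to pass per-element preferred locations like this through pyspark's parallelize.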

- Philip
