spark-dev mailing list archives

From Reynold Xin <>
Subject Re: how to replace hdfs with a custom distributed fs ?
Date Sat, 11 Nov 2017 17:38:07 GMT
You can implement the Hadoop FileSystem API for your distributed Java fs
and just plug it into Spark through the Hadoop API.
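To illustrate the suggestion above, here is a minimal sketch of what subclassing Hadoop's `org.apache.hadoop.fs.FileSystem` looks like. The class name `MyFileSystem`, the URI scheme `myfs`, and all method bodies are hypothetical placeholders; a real adapter must implement every abstract method by delegating to the custom fs, and needs `hadoop-common` on the classpath.

```java
// Hypothetical sketch (not from the thread): adapting a custom distributed
// fs to Hadoop's FileSystem API so Spark can read/write through it.
// Scheme "myfs" and class name are illustrative assumptions.
import java.io.IOException;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.permission.FsPermission;
import org.apache.hadoop.util.Progressable;

public class MyFileSystem extends FileSystem {
    private URI uri;

    @Override
    public void initialize(URI name, Configuration conf) throws IOException {
        super.initialize(name, conf);
        this.uri = name;
        // TODO: open a connection to the backing distributed fs here
    }

    @Override
    public URI getUri() { return uri; }

    // Each abstract method should delegate to the custom fs; the stubs
    // below only mark where that wiring goes.
    @Override
    public FSDataInputStream open(Path f, int bufferSize) throws IOException {
        // wrap the custom fs's read stream in an FSDataInputStream
        throw new UnsupportedOperationException("TODO");
    }

    @Override
    public FSDataOutputStream create(Path f, FsPermission permission,
            boolean overwrite, int bufferSize, short replication,
            long blockSize, Progressable progress) throws IOException {
        throw new UnsupportedOperationException("TODO");
    }

    @Override
    public FSDataOutputStream append(Path f, int bufferSize,
            Progressable progress) throws IOException {
        throw new UnsupportedOperationException("append not supported");
    }

    @Override
    public boolean rename(Path src, Path dst) throws IOException {
        throw new UnsupportedOperationException("TODO");
    }

    @Override
    public boolean delete(Path f, boolean recursive) throws IOException {
        throw new UnsupportedOperationException("TODO");
    }

    @Override
    public FileStatus[] listStatus(Path f) throws IOException {
        throw new UnsupportedOperationException("TODO");
    }

    @Override
    public void setWorkingDirectory(Path newDir) {
        throw new UnsupportedOperationException("TODO");
    }

    @Override
    public Path getWorkingDirectory() {
        return new Path("/");
    }

    @Override
    public boolean mkdirs(Path f, FsPermission permission) throws IOException {
        throw new UnsupportedOperationException("TODO");
    }

    @Override
    public FileStatus getFileStatus(Path f) throws IOException {
        throw new UnsupportedOperationException("TODO");
    }
}
```

Once implemented, the class is registered with Spark through ordinary Hadoop configuration, e.g. `spark.hadoop.fs.myfs.impl=com.example.MyFileSystem` (scheme and class name again illustrative), after which paths like `myfs://host/data` can be used wherever Spark accepts an `hdfs://` path.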

On Sat, Nov 11, 2017 at 9:37 AM, Cristian Lorenzetto <> wrote:

> Hi, I have my own distributed Java fs and I would like to implement a class
> for storing data in Spark.
> How do I do that? Is there an example?
