flink-user mailing list archives

From Robert Schmidtke <ro.schmid...@gmail.com>
Subject Reading worker-local input files
Date Tue, 27 Dec 2016 12:04:44 GMT
Hi everyone,

I'm running Flink and/or Hadoop on my cluster, and I have them generate
log data in each worker node's /local folder (a regular mount point). Now I
would like to process these files using Flink, but I'm not quite sure how
to tell Flink to use each worker node's /local folder as the input path,
because I'd expect Flink to read only the /local folder of the submitting
node. Do I have to put these files into HDFS, or is there a way to tell
Flink that the file:///local URI refers to worker-local data? Thanks in
advance for any hints, and best regards,

Robert

-- 
My GPG Key ID: 336E2680
