flink-commits mailing list archives

From fhue...@apache.org
Subject incubator-flink git commit: [FLINK-1139] Updated Hadoop Compatibility documentation
Date Mon, 08 Dec 2014 09:38:20 GMT
Repository: incubator-flink
Updated Branches:
  refs/heads/master 15f58bb23 -> f945e2c9c


[FLINK-1139] Updated Hadoop Compatibility documentation


Project: http://git-wip-us.apache.org/repos/asf/incubator-flink/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-flink/commit/f945e2c9
Tree: http://git-wip-us.apache.org/repos/asf/incubator-flink/tree/f945e2c9
Diff: http://git-wip-us.apache.org/repos/asf/incubator-flink/diff/f945e2c9

Branch: refs/heads/master
Commit: f945e2c9c9c694b584dd000cf1d32b566f414fee
Parents: 15f58bb
Author: Fabian Hueske <fhueske@apache.org>
Authored: Mon Dec 8 10:37:22 2014 +0100
Committer: Fabian Hueske <fhueske@apache.org>
Committed: Mon Dec 8 10:37:43 2014 +0100

----------------------------------------------------------------------
 docs/hadoop_compatibility.md | 8 ++------
 1 file changed, 2 insertions(+), 6 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/incubator-flink/blob/f945e2c9/docs/hadoop_compatibility.md
----------------------------------------------------------------------
diff --git a/docs/hadoop_compatibility.md b/docs/hadoop_compatibility.md
index 73b7f5e..59e8c51 100644
--- a/docs/hadoop_compatibility.md
+++ b/docs/hadoop_compatibility.md
@@ -114,12 +114,9 @@ hadoopOF.getConfiguration().set("mapreduce.output.textoutputformat.separator", "
 TextOutputFormat.setOutputPath(job, new Path(outputPath));
 		
 // Emit data using the Hadoop TextOutputFormat.
-result.output(hadoopOF)
-      .setParallelism(1);
+result.output(hadoopOF);
 ~~~
 
-**Please note:** At the moment, Hadoop OutputFormats must be executed with a parallelism of 1 (DOP = 1). This limitation will be resolved soon.
-
 ### Using Hadoop Mappers and Reducers
 
 Hadoop Mappers are semantically equivalent to Flink's [FlatMapFunctions](dataset_transformations.html#flatmap) and Hadoop Reducers are equivalent to Flink's [GroupReduceFunctions](dataset_transformations.html#groupreduce-on-grouped-dataset). Flink provides wrappers for implementations of Hadoop MapReduce's `Mapper` and `Reducer` interfaces, i.e., you can reuse your Hadoop Mappers and Reducers in regular Flink programs. At the moment, only the Mapper and Reduce interfaces of Hadoop's mapred API (`org.apache.hadoop.mapred`) are supported.
@@ -192,8 +189,7 @@ hadoopOF.getConfiguration().set("mapreduce.output.textoutputformat.separator", "
 TextOutputFormat.setOutputPath(job, new Path(outputPath));
 		
 // Emit data using the Hadoop TextOutputFormat.
-result.output(hadoopOF)
-      .setParallelism(1);
+result.output(hadoopOF);
 
 // Execute Program
 env.execute("Hadoop WordCount");

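For reference, the context lines of the two hunks above can be assembled into a fuller sketch of the post-commit usage. This is a hedged reconstruction, not part of the commit itself: it assumes the `HadoopOutputFormat` wrapper from Flink's hadoop-compatibility module of that era (`org.apache.flink.hadoopcompatibility.mapreduce`) is on the classpath, and the surrounding `result` DataSet, `outputPath` string, and `env` environment are placeholders assumed to be defined earlier in the program.

```java
// Sketch of emitting a Flink DataSet through a Hadoop TextOutputFormat,
// as the documentation reads after this commit. Class names are assumed
// from the ~0.8-era hadoop-compatibility module; result, outputPath, and
// env are hypothetical placeholders defined elsewhere in the program.
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.hadoopcompatibility.mapreduce.HadoopOutputFormat;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;

public class HadoopWordCountSink {

    public static void emit(DataSet<Tuple2<Text, IntWritable>> result,
                            String outputPath,
                            ExecutionEnvironment env) throws Exception {
        // Set up the Hadoop TextOutputFormat, wrapped so Flink can drive it.
        Job job = Job.getInstance();
        HadoopOutputFormat<Text, IntWritable> hadoopOF =
            new HadoopOutputFormat<Text, IntWritable>(
                new TextOutputFormat<Text, IntWritable>(), job);
        hadoopOF.getConfiguration().set(
            "mapreduce.output.textoutputformat.separator", " ");
        TextOutputFormat.setOutputPath(job, new Path(outputPath));

        // Emit data using the Hadoop TextOutputFormat. As of this commit,
        // no trailing setParallelism(1) call is needed.
        result.output(hadoopOF);

        // Execute program.
        env.execute("Hadoop WordCount");
    }
}
```

The two deleted lines in each hunk (`.setParallelism(1)` and the DOP = 1 note) reflect that Hadoop OutputFormats no longer need to run with a parallelism of 1, which is the entire substance of this documentation update.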
