flink-commits mailing list archives

From rmetz...@apache.org
Subject flink git commit: [FLINK-4895] Drop Hadoop1 support and remove related build infrastructure
Date Wed, 30 Nov 2016 14:43:50 GMT
Repository: flink
Updated Branches:
  refs/heads/master ae0975c16 -> ac7d87158


[FLINK-4895] Drop Hadoop1 support and remove related build infrastructure

This closes #2850


Project: http://git-wip-us.apache.org/repos/asf/flink/repo
Commit: http://git-wip-us.apache.org/repos/asf/flink/commit/ac7d8715
Tree: http://git-wip-us.apache.org/repos/asf/flink/tree/ac7d8715
Diff: http://git-wip-us.apache.org/repos/asf/flink/diff/ac7d8715

Branch: refs/heads/master
Commit: ac7d87158b3f2ebc72633fc8c925ded5fd408273
Parents: ae0975c
Author: Robert Metzger <rmetzger@apache.org>
Authored: Tue Nov 22 11:24:14 2016 +0100
Committer: Robert Metzger <rmetzger@apache.org>
Committed: Wed Nov 30 14:29:25 2016 +0100

----------------------------------------------------------------------
 .travis.yml                                     |   5 -
 docs/_config.yml                                |   1 -
 docs/setup/building.md                          |  33 +---
 .../flink-hadoop-compatibility/pom.xml          |   2 +-
 .../mapreduce/example/WordCount.java            |   1 -
 flink-batch-connectors/flink-hbase/pom.xml      | 128 +++++----------
 flink-dist/pom.xml                              |  24 +--
 flink-fs-tests/pom.xml                          |   2 +-
 flink-java/pom.xml                              |   2 +-
 flink-mesos/pom.xml                             |   2 +-
 .../main/resources/archetype-resources/pom.xml  |   1 -
 .../main/resources/archetype-resources/pom.xml  |   1 -
 flink-runtime/pom.xml                           |   2 +-
 .../runtime/fs/hdfs/HadoopDataOutputStream.java | 109 +------------
 .../flink-shaded-hadoop1/pom.xml                | 159 -------------------
 flink-shaded-hadoop/pom.xml                     |   8 +-
 .../flink-connector-filesystem/pom.xml          |   2 +-
 flink-streaming-connectors/pom.xml              |  16 +-
 flink-yarn-tests/pom.xml                        |   2 +-
 flink-yarn/pom.xml                              |   2 +-
 pom.xml                                         |  44 +----
 tools/create_release_files.sh                   |  11 --
 tools/deploy_to_maven.sh                        |  14 +-
 tools/generate_specific_pom.sh                  | 129 ---------------
 24 files changed, 63 insertions(+), 637 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/flink/blob/ac7d8715/.travis.yml
----------------------------------------------------------------------
diff --git a/.travis.yml b/.travis.yml
index e15673e..1445f8d 100644
--- a/.travis.yml
+++ b/.travis.yml
@@ -35,11 +35,6 @@ matrix:
     - jdk: "oraclejdk7"
       env: PROFILE="-Dhadoop.version=2.3.0 -Pflink-fast-tests-b,include-kinesis"
 
-    - jdk: "openjdk7" 
-      env: PROFILE="-Dhadoop.profile=1 -Pflink-fast-tests-a,include-kinesis"
-    - jdk: "openjdk7" 
-      env: PROFILE="-Dhadoop.profile=1 -Pflink-fast-tests-b,include-kinesis"
-
 
 git:
   depth: 100

http://git-wip-us.apache.org/repos/asf/flink/blob/ac7d8715/docs/_config.yml
----------------------------------------------------------------------
diff --git a/docs/_config.yml b/docs/_config.yml
index 1f9f157..b3d8d6f 100644
--- a/docs/_config.yml
+++ b/docs/_config.yml
@@ -27,7 +27,6 @@
 # we change the version for the complete docs when forking of a release branch
 # etc.
 version: "1.2-SNAPSHOT"
-version_hadoop1: "1.2-hadoop1-SNAPSHOT"
 version_short: "1.2" # Used for the top navbar w/o snapshot suffix
 is_snapshot_version: true
 

http://git-wip-us.apache.org/repos/asf/flink/blob/ac7d8715/docs/setup/building.md
----------------------------------------------------------------------
diff --git a/docs/setup/building.md b/docs/setup/building.md
index c6ef5df..8c60997 100644
--- a/docs/setup/building.md
+++ b/docs/setup/building.md
@@ -76,31 +76,11 @@ mvn clean install
 
 ## Hadoop Versions
 
-{% info %} Most users do not need to do this manually. The [download page]({{ site.download_url }})  contains binary packages for common Hadoop versions.
+{% info %} Most users do not need to do this manually. The [download page]({{ site.download_url }}) contains binary packages for common Hadoop versions.
 
 Flink has dependencies to HDFS and YARN which are both dependencies from [Apache Hadoop](http://hadoop.apache.org). There exist many different versions of Hadoop (from both the upstream project and the different Hadoop distributions). If you are using a wrong combination of versions, exceptions can occur.
 
-There are two main versions of Hadoop that we need to differentiate:
-- **Hadoop 1**, with all versions starting with zero or one, like *0.20*, *0.23* or *1.2.1*.
-- **Hadoop 2**, with all versions starting with 2, like *2.6.0*.
-
-The main differentiation between Hadoop 1 and Hadoop 2 is the availability of [Hadoop YARN](https://hadoop.apache.org/docs/current/hadoop-yarn/hadoop-yarn-site/YARN.html), Hadoop's cluster resource manager.
-
-**By default, Flink is using the Hadoop 2 dependencies**.
-
-### Hadoop 1
-
-To build Flink for Hadoop 1, issue the following command:
-
-~~~bash
-mvn clean install -DskipTests -Dhadoop.profile=1
-~~~
-
-The `-Dhadoop.profile=1` flag instructs Maven to build Flink for Hadoop 1. Note that the features included in Flink change when using a different Hadoop profile. In particular, there is no support for YARN and HBase in Hadoop 1 builds.
-
-### Hadoop 2.x
-
-Hadoop 2.X versions are only supported from version 2.3.0 upwards.
+Hadoop is only supported from version 2.3.0 upwards.
 You can also specify a specific Hadoop version to build against:
 
 ~~~bash
@@ -176,12 +156,3 @@ in the compiler configuration of the `pom.xml` file of the module causing the er
 
 {% top %}
 
-## Internals
-
-The builds with Maven are controlled by [properties](http://maven.apache.org/pom.html#Properties) and [build profiles](http://maven.apache.org/guides/introduction/introduction-to-profiles.html). There are two profiles, one for `hadoop1` and one for `hadoop2`. When the `hadoop2` profile is enabled (default), the system will also build the YARN client.
-
-To enable the `hadoop1` profile, set `-Dhadoop.profile=1` when building. Depending on the profile, there are two Hadoop versions, set via properties. For `hadoop1`, we use 1.2.1 by default, for `hadoop2` it is 2.3.0.
-
-You can change these versions with the `hadoop-two.version` (or `hadoop-one.version`) property. For example `-Dhadoop-two.version=2.4.0`.
-
-{% top %}

http://git-wip-us.apache.org/repos/asf/flink/blob/ac7d8715/flink-batch-connectors/flink-hadoop-compatibility/pom.xml
----------------------------------------------------------------------
diff --git a/flink-batch-connectors/flink-hadoop-compatibility/pom.xml b/flink-batch-connectors/flink-hadoop-compatibility/pom.xml
index 8143a03..8f423d9 100644
--- a/flink-batch-connectors/flink-hadoop-compatibility/pom.xml
+++ b/flink-batch-connectors/flink-hadoop-compatibility/pom.xml
@@ -62,7 +62,7 @@ under the License.
 
 		<dependency>
 			<groupId>org.apache.flink</groupId>
-			<artifactId>${shading-artifact.name}</artifactId>
+			<artifactId>flink-shaded-hadoop2</artifactId>
 			<version>${project.version}</version>
 		</dependency>
 

http://git-wip-us.apache.org/repos/asf/flink/blob/ac7d8715/flink-batch-connectors/flink-hadoop-compatibility/src/test/java/org/apache/flink/test/hadoopcompatibility/mapreduce/example/WordCount.java
----------------------------------------------------------------------
diff --git a/flink-batch-connectors/flink-hadoop-compatibility/src/test/java/org/apache/flink/test/hadoopcompatibility/mapreduce/example/WordCount.java b/flink-batch-connectors/flink-hadoop-compatibility/src/test/java/org/apache/flink/test/hadoopcompatibility/mapreduce/example/WordCount.java
index 3de3f72..ed83d78 100644
--- a/flink-batch-connectors/flink-hadoop-compatibility/src/test/java/org/apache/flink/test/hadoopcompatibility/mapreduce/example/WordCount.java
+++ b/flink-batch-connectors/flink-hadoop-compatibility/src/test/java/org/apache/flink/test/hadoopcompatibility/mapreduce/example/WordCount.java
@@ -77,7 +77,6 @@ public class WordCount {
 		HadoopOutputFormat<Text, IntWritable> hadoopOutputFormat = new HadoopOutputFormat<Text, IntWritable>(new TextOutputFormat<Text, IntWritable>(), job);
 		hadoopOutputFormat.getConfiguration().set("mapreduce.output.textoutputformat.separator", " ");
 		hadoopOutputFormat.getConfiguration().set("mapred.textoutputformat.separator", " "); // set the value for both, since this test
-		// is being executed with both types (hadoop1 and hadoop2 profile)
 		TextOutputFormat.setOutputPath(job, new Path(outputPath));
 		
 		// Output & Execute

http://git-wip-us.apache.org/repos/asf/flink/blob/ac7d8715/flink-batch-connectors/flink-hbase/pom.xml
----------------------------------------------------------------------
diff --git a/flink-batch-connectors/flink-hbase/pom.xml b/flink-batch-connectors/flink-hbase/pom.xml
index a1e0ad6..70a5692 100644
--- a/flink-batch-connectors/flink-hbase/pom.xml
+++ b/flink-batch-connectors/flink-hbase/pom.xml
@@ -34,8 +34,7 @@ under the License.
 	<packaging>jar</packaging>
 
 	<properties>
-		<hbase.hadoop1.version>0.98.22-hadoop1</hbase.hadoop1.version>
-		<hbase.hadoop2.version>1.2.3</hbase.hadoop2.version>
+		<hbase.version>1.2.3</hbase.version>
 	</properties>
 
 	<build>
@@ -85,7 +84,7 @@ under the License.
 
 		<dependency>
 			<groupId>org.apache.flink</groupId>
-			<artifactId>${shading-artifact.name}</artifactId>
+			<artifactId>flink-shaded-hadoop2</artifactId>
 			<version>${project.version}</version>
 			<scope>provided</scope>
 		</dependency>
@@ -198,100 +197,51 @@ under the License.
 			</exclusions>
 		</dependency>
 
-	</dependencies>
-
-	<profiles>
-		<profile>
-			<id>hadoop-1</id>
-			<activation>
-				<property>
-					<!-- Please do not remove the 'hadoop1' comment. See ./tools/generate_specific_pom.sh -->
-					<!--hadoop1--><name>hadoop.profile</name><value>1</value>
-				</property>
-			</activation>
-			<properties>
-				<hbase.version>${hbase.hadoop1.version}</hbase.version>
-				<!--
-				Required test dependencies are only available for Hadoop-2.
-				Disabling tests for Hadoop-1 profile.
-				-->
-				<maven.test.skip>true</maven.test.skip>
-			</properties>
-
-		</profile>
-
-		<profile>
-			<id>hadoop-2</id>
-			<repositories>
-				<repository>
-					<id>hadoop-2-repo2</id>
-					<url>https://repo.maven.apache.org/maven2</url>
-					<releases>
-						<enabled>true</enabled>
-					</releases>
-					<snapshots>
-						<enabled>false</enabled>
-					</snapshots>
-				</repository>
-			</repositories>
-			<activation>
-				<property>
-					<!-- Please do not remove the 'hadoop2' comment. See ./tools/generate_specific_pom.sh -->
-					<!--hadoop2--><name>!hadoop.profile</name>
-				</property>
-			</activation>
-			<properties>
-				<hbase.version>${hbase.hadoop2.version}</hbase.version>
-			</properties>
-
-			<dependencies>
-				<!-- Test dependencies are only available for Hadoop-2. -->
-				<dependency>
-					<groupId>org.apache.hbase</groupId>
-					<artifactId>hbase-server</artifactId>
-					<version>${hbase.version}</version>
-					<classifier>tests</classifier>
-					<scope>test</scope>
-				</dependency>
-
-				<dependency>
-					<groupId>org.apache.hadoop</groupId>
-					<artifactId>hadoop-minicluster</artifactId>
-					<version>${hadoop.version}</version>
-					<scope>test</scope>
-				</dependency>
+		<!-- Test dependencies are only available for Hadoop-2. -->
+		<dependency>
+			<groupId>org.apache.hbase</groupId>
+			<artifactId>hbase-server</artifactId>
+			<version>${hbase.version}</version>
+			<classifier>tests</classifier>
+			<scope>test</scope>
+		</dependency>
 
-				<dependency>
-					<groupId>org.apache.hbase</groupId>
-					<artifactId>hbase-hadoop-compat</artifactId>
-					<version>${hbase.version}</version>
-					<scope>test</scope>
-					<type>test-jar</type>
-				</dependency>
+		<dependency>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-minicluster</artifactId>
+			<version>${hadoop.version}</version>
+			<scope>test</scope>
+		</dependency>
 
-				<dependency>
-					<groupId>org.apache.hadoop</groupId>
-					<artifactId>hadoop-hdfs</artifactId>
-					<version>${hadoop.version}</version>
-					<type>test-jar</type>
-					<scope>test</scope>
-				</dependency>
+		<dependency>
+			<groupId>org.apache.hbase</groupId>
+			<artifactId>hbase-hadoop-compat</artifactId>
+			<version>${hbase.version}</version>
+			<scope>test</scope>
+			<type>test-jar</type>
+		</dependency>
 
-				<dependency>
-					<groupId>org.apache.hbase</groupId>
-					<artifactId>hbase-hadoop2-compat</artifactId>
-					<version>${hbase.version}</version>
-					<scope>test</scope>
-					<type>test-jar</type>
-				</dependency>
-			</dependencies>
+		<dependency>
+			<groupId>org.apache.hadoop</groupId>
+			<artifactId>hadoop-hdfs</artifactId>
+			<version>${hadoop.version}</version>
+			<type>test-jar</type>
+			<scope>test</scope>
+		</dependency>
 
-		</profile>
+		<dependency>
+			<groupId>org.apache.hbase</groupId>
+			<artifactId>hbase-hadoop2-compat</artifactId>
+			<version>${hbase.version}</version>
+			<scope>test</scope>
+			<type>test-jar</type>
+		</dependency>
+	</dependencies>
 
+	<profiles>
 		<profile>
 			<id>cdh5.1.3</id>
 			<properties>
-				<hadoop.profile>2</hadoop.profile>
 				<hbase.version>0.98.1-cdh5.1.3</hbase.version>
 				<hadoop.version>2.3.0-cdh5.1.3</hadoop.version>
 				<!-- Cloudera use different versions for hadoop core and commons-->

http://git-wip-us.apache.org/repos/asf/flink/blob/ac7d8715/flink-dist/pom.xml
----------------------------------------------------------------------
diff --git a/flink-dist/pom.xml b/flink-dist/pom.xml
index 319f6af..93feec6 100644
--- a/flink-dist/pom.xml
+++ b/flink-dist/pom.xml
@@ -131,28 +131,16 @@ under the License.
 			<artifactId>flink-statebackend-rocksdb_2.10</artifactId>
 			<version>${project.version}</version>
 		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-yarn_2.10</artifactId>
+			<version>${project.version}</version>
+		</dependency>
 		
 	</dependencies>
 
 	<profiles>
-		<!-- See main pom.xml for explanation of Hadoop profiles -->
-		<profile>
-			<id>include-yarn</id>
-			<activation>
-				<property>
-					<!-- Please do not remove the 'hadoop2' comment. See ./tools/generate_specific_pom.sh -->
-					<!--hadoop2--><name>!hadoop.profile</name>
-				</property>
-			</activation>
-			<dependencies>
-				<dependency>
-					<groupId>org.apache.flink</groupId>
-					<artifactId>flink-yarn_2.10</artifactId>
-					<version>${project.version}</version>
-				</dependency>
-			</dependencies>
-		</profile>
-
 		<profile>
 			<!-- Creates/Removes the 'build-target' symlink in the root directory (only Unix systems) -->
 			<id>symlink-build-target</id>

http://git-wip-us.apache.org/repos/asf/flink/blob/ac7d8715/flink-fs-tests/pom.xml
----------------------------------------------------------------------
diff --git a/flink-fs-tests/pom.xml b/flink-fs-tests/pom.xml
index 4ccaaec..f480608 100644
--- a/flink-fs-tests/pom.xml
+++ b/flink-fs-tests/pom.xml
@@ -38,7 +38,7 @@ under the License.
 	<dependencies>
 		<dependency>
 			<groupId>org.apache.flink</groupId>
-			<artifactId>${shading-artifact.name}</artifactId>
+			<artifactId>flink-shaded-hadoop2</artifactId>
 			<version>${project.version}</version>
 			<scope>test</scope>
 		</dependency>

http://git-wip-us.apache.org/repos/asf/flink/blob/ac7d8715/flink-java/pom.xml
----------------------------------------------------------------------
diff --git a/flink-java/pom.xml b/flink-java/pom.xml
index 5bc81c6..728b6ee 100644
--- a/flink-java/pom.xml
+++ b/flink-java/pom.xml
@@ -43,7 +43,7 @@ under the License.
 		
 		<dependency>
 			<groupId>org.apache.flink</groupId>
-			<artifactId>${shading-artifact.name}</artifactId>
+			<artifactId>flink-shaded-hadoop2</artifactId>
 			<version>${project.version}</version>
 		</dependency>
 

http://git-wip-us.apache.org/repos/asf/flink/blob/ac7d8715/flink-mesos/pom.xml
----------------------------------------------------------------------
diff --git a/flink-mesos/pom.xml b/flink-mesos/pom.xml
index 8814762..bf60f07 100644
--- a/flink-mesos/pom.xml
+++ b/flink-mesos/pom.xml
@@ -50,7 +50,7 @@ under the License.
 
 		<dependency>
 			<groupId>org.apache.flink</groupId>
-			<artifactId>${shading-artifact.name}</artifactId>
+			<artifactId>flink-shaded-hadoop2</artifactId>
 			<version>${project.version}</version>
 		</dependency>
 

http://git-wip-us.apache.org/repos/asf/flink/blob/ac7d8715/flink-quickstart/flink-quickstart-java/src/main/resources/archetype-resources/pom.xml
----------------------------------------------------------------------
diff --git a/flink-quickstart/flink-quickstart-java/src/main/resources/archetype-resources/pom.xml b/flink-quickstart/flink-quickstart-java/src/main/resources/archetype-resources/pom.xml
index e62d9d4..cf66bcf 100644
--- a/flink-quickstart/flink-quickstart-java/src/main/resources/archetype-resources/pom.xml
+++ b/flink-quickstart/flink-quickstart-java/src/main/resources/archetype-resources/pom.xml
@@ -166,7 +166,6 @@ under the License.
 									Everything else will be packaged into the fat-jar
 									-->
 									<exclude>org.apache.flink:flink-annotations</exclude>
-									<exclude>org.apache.flink:flink-shaded-hadoop1</exclude>
 									<exclude>org.apache.flink:flink-shaded-hadoop2</exclude>
 									<exclude>org.apache.flink:flink-shaded-curator-recipes</exclude>
 									<exclude>org.apache.flink:flink-core</exclude>

http://git-wip-us.apache.org/repos/asf/flink/blob/ac7d8715/flink-quickstart/flink-quickstart-scala/src/main/resources/archetype-resources/pom.xml
----------------------------------------------------------------------
diff --git a/flink-quickstart/flink-quickstart-scala/src/main/resources/archetype-resources/pom.xml b/flink-quickstart/flink-quickstart-scala/src/main/resources/archetype-resources/pom.xml
index a62cf79..afb9b50 100644
--- a/flink-quickstart/flink-quickstart-scala/src/main/resources/archetype-resources/pom.xml
+++ b/flink-quickstart/flink-quickstart-scala/src/main/resources/archetype-resources/pom.xml
@@ -170,7 +170,6 @@ under the License.
 									Everything else will be packaged into the fat-jar
 									-->
 									<exclude>org.apache.flink:flink-annotations</exclude>
-									<exclude>org.apache.flink:flink-shaded-hadoop1</exclude>
 									<exclude>org.apache.flink:flink-shaded-hadoop2</exclude>
 									<exclude>org.apache.flink:flink-shaded-curator-recipes</exclude>
 									<exclude>org.apache.flink:flink-core</exclude>

http://git-wip-us.apache.org/repos/asf/flink/blob/ac7d8715/flink-runtime/pom.xml
----------------------------------------------------------------------
diff --git a/flink-runtime/pom.xml b/flink-runtime/pom.xml
index 69fdd21..eec75c9 100644
--- a/flink-runtime/pom.xml
+++ b/flink-runtime/pom.xml
@@ -52,7 +52,7 @@ under the License.
 		
 		<dependency>
 			<groupId>org.apache.flink</groupId>
-			<artifactId>${shading-artifact.name}</artifactId>
+			<artifactId>flink-shaded-hadoop2</artifactId>
 			<version>${project.version}</version>
 		</dependency>
 

http://git-wip-us.apache.org/repos/asf/flink/blob/ac7d8715/flink-runtime/src/main/java/org/apache/flink/runtime/fs/hdfs/HadoopDataOutputStream.java
----------------------------------------------------------------------
diff --git a/flink-runtime/src/main/java/org/apache/flink/runtime/fs/hdfs/HadoopDataOutputStream.java b/flink-runtime/src/main/java/org/apache/flink/runtime/fs/hdfs/HadoopDataOutputStream.java
index d6fbc19..8787181 100644
--- a/flink-runtime/src/main/java/org/apache/flink/runtime/fs/hdfs/HadoopDataOutputStream.java
+++ b/flink-runtime/src/main/java/org/apache/flink/runtime/fs/hdfs/HadoopDataOutputStream.java
@@ -19,8 +19,6 @@
 package org.apache.flink.runtime.fs.hdfs;
 
 import java.io.IOException;
-import java.lang.reflect.InvocationTargetException;
-import java.lang.reflect.Method;
 
 import org.apache.flink.core.fs.FSDataOutputStream;
 
@@ -57,78 +55,12 @@ public class HadoopDataOutputStream extends FSDataOutputStream {
 
 	@Override
 	public void flush() throws IOException {
-		if (HFLUSH_METHOD != null) {
-			try {
-				HFLUSH_METHOD.invoke(fdos);
-			}
-			catch (InvocationTargetException e) {
-				Throwable cause = e.getTargetException();
-				if (cause instanceof IOException) {
-					throw (IOException) cause;
-				}
-				else if (cause instanceof RuntimeException) {
-					throw (RuntimeException) cause;
-				}
-				else if (cause instanceof Error) {
-					throw (Error) cause;
-				}
-				else {
-					throw new IOException("Exception while invoking hflush()", cause);
-				}
-			}
-			catch (IllegalAccessException e) {
-				throw new IOException("Cannot invoke hflush()", e);
-			}
-		}
-		else if (HFLUSH_ERROR != null) {
-			if (HFLUSH_ERROR instanceof NoSuchMethodException) {
-				throw new UnsupportedOperationException("hflush() method is not available in this version of Hadoop.");
-			}
-			else {
-				throw new IOException("Cannot access hflush() method", HFLUSH_ERROR);
-			}
-		}
-		else {
-			throw new UnsupportedOperationException("hflush() is not available in this version of Hadoop.");
-		}
+		fdos.hflush();
 	}
 
 	@Override
 	public void sync() throws IOException {
-		if (HSYNC_METHOD != null) {
-			try {
-				HSYNC_METHOD.invoke(fdos);
-			}
-			catch (InvocationTargetException e) {
-				Throwable cause = e.getTargetException();
-				if (cause instanceof IOException) {
-					throw (IOException) cause;
-				}
-				else if (cause instanceof RuntimeException) {
-					throw (RuntimeException) cause;
-				}
-				else if (cause instanceof Error) {
-					throw (Error) cause;
-				}
-				else {
-					throw new IOException("Exception while invoking hsync()", cause);
-				}
-			}
-			catch (IllegalAccessException e) {
-				throw new IOException("Cannot invoke hsync()", e);
-			}
-		}
-		else if (HSYNC_ERROR != null) {
-			if (HSYNC_ERROR instanceof NoSuchMethodException) {
-				throw new UnsupportedOperationException("hsync() method is not available in this version of Hadoop.");
-			}
-			else {
-				throw new IOException("Cannot access hsync() method", HSYNC_ERROR);
-			}
-		}
-		else {
-			throw new UnsupportedOperationException("hsync() is not available in this version of Hadoop.");
-		}
+		fdos.hsync();
 	}
 
 	/**
@@ -138,42 +70,5 @@ public class HadoopDataOutputStream extends FSDataOutputStream {
 	public org.apache.hadoop.fs.FSDataOutputStream getHadoopOutputStream() {
 		return fdos;
 	}
-	
-	// ------------------------------------------------------------------------
-	// utilities to bridge hsync and hflush to hadoop, even through it is not supported in Hadoop 1
-	// ------------------------------------------------------------------------
-	
-	private static final Method HFLUSH_METHOD;
-	private static final Method HSYNC_METHOD;
-	
-	private static final Throwable HFLUSH_ERROR;
-	private static final Throwable HSYNC_ERROR;
-	
-	static {
-		Method hflush = null;
-		Method hsync = null;
 
-		Throwable flushError = null;
-		Throwable syncError = null;
-		
-		try {
-			hflush = org.apache.hadoop.fs.FSDataOutputStream.class.getMethod("hflush");
-		}
-		catch (Throwable t) {
-			flushError = t;
-		}
-
-		try {
-			hsync = org.apache.hadoop.fs.FSDataOutputStream.class.getMethod("hsync");
-		}
-		catch (Throwable t) {
-			syncError = t;
-		}
-		
-		HFLUSH_METHOD = hflush;
-		HSYNC_METHOD = hsync;
-		
-		HFLUSH_ERROR = flushError;
-		HSYNC_ERROR = syncError;
-	}
 }

http://git-wip-us.apache.org/repos/asf/flink/blob/ac7d8715/flink-shaded-hadoop/flink-shaded-hadoop1/pom.xml
----------------------------------------------------------------------
diff --git a/flink-shaded-hadoop/flink-shaded-hadoop1/pom.xml b/flink-shaded-hadoop/flink-shaded-hadoop1/pom.xml
deleted file mode 100644
index 891ab7d..0000000
--- a/flink-shaded-hadoop/flink-shaded-hadoop1/pom.xml
+++ /dev/null
@@ -1,159 +0,0 @@
-<?xml version="1.0" encoding="UTF-8"?>
-<!--
-Licensed to the Apache Software Foundation (ASF) under one
-or more contributor license agreements.  See the NOTICE file
-distributed with this work for additional information
-regarding copyright ownership.  The ASF licenses this file
-to you under the Apache License, Version 2.0 (the
-"License"); you may not use this file except in compliance
-with the License.  You may obtain a copy of the License at
-
-  http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing,
-software distributed under the License is distributed on an
-"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-KIND, either express or implied.  See the License for the
-specific language governing permissions and limitations
-under the License.
--->
-<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
-	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
-
-	<modelVersion>4.0.0</modelVersion>
-
-	<parent>
-		<groupId>org.apache.flink</groupId>
-		<artifactId>flink-shaded-hadoop</artifactId>
-		<version>1.2-SNAPSHOT</version>
-		<relativePath>..</relativePath>
-	</parent>
-
-	<artifactId>flink-shaded-hadoop1</artifactId>
-	<name>flink-shaded-hadoop1</name>
-
-	<packaging>jar</packaging>
-
-	<dependencies>
-		<dependency>
-			<groupId>org.apache.hadoop</groupId>
-			<artifactId>hadoop-core</artifactId>
-			<version>${hadoop.version}</version>
-			<exclusions>
-				<exclusion>
-					<groupId>asm</groupId>
-					<artifactId>asm</artifactId>
-				</exclusion>
-				<exclusion>
-					<groupId>org.ow2.asm</groupId>
-					<artifactId>asm</artifactId>
-				</exclusion>
-				<exclusion>
-					<groupId>tomcat</groupId>
-					<artifactId>jasper-compiler</artifactId>
-				</exclusion>
-				<exclusion>
-					<groupId>tomcat</groupId>
-					<artifactId>jasper-runtime</artifactId>
-				</exclusion>
-				<exclusion>
-					<groupId>org.mortbay.jetty</groupId>
-					<artifactId>jetty</artifactId>
-				</exclusion>
-				<exclusion>
-					<groupId>org.mortbay.jetty</groupId>
-					<artifactId>jsp-api-2.1</artifactId>
-				</exclusion>
-				<exclusion>
-					<groupId>org.mortbay.jetty</groupId>
-					<artifactId>jsp-2.1</artifactId>
-				</exclusion>
-				<exclusion>
-					<groupId>org.eclipse.jdt</groupId>
-					<artifactId>core</artifactId>
-				</exclusion>
-				<exclusion>
-					<groupId>javax.servlet</groupId>
-					<artifactId>servlet-api</artifactId>
-				</exclusion>
-				<exclusion>
-					<groupId>com.sun.jersey</groupId>
-					<artifactId>jersey-core</artifactId>
-				</exclusion>
-				<exclusion>
-					<groupId>com.sun.jersey</groupId>
-					<artifactId>jersey-json</artifactId>
-				</exclusion>
-				<exclusion>
-					<groupId>org.codehaus.jettison</groupId>
-					<artifactId>jettison</artifactId>
-				</exclusion>
-				<exclusion>
-					<groupId>com.sun.jersey</groupId>
-					<artifactId>jersey-server</artifactId>
-				</exclusion>
-				<exclusion>
-					<groupId>tomcat</groupId>
-					<artifactId>jasper-compiler</artifactId>
-				</exclusion>
-				<exclusion>
-					<groupId>tomcat</groupId>
-					<artifactId>jasper-runtime</artifactId>
-				</exclusion>
-				<exclusion>
-					<groupId>javax.servlet.jsp</groupId>
-					<artifactId>jsp-api</artifactId>
-				</exclusion>
-				<exclusion>
-					<groupId>com.sun.jersey.jersey-test-framework</groupId>
-					<artifactId>jersey-test-framework-grizzly2</artifactId>
-				</exclusion>
-				<exclusion>
-					<groupId>com.sun.jersey.jersey-test-framework</groupId>
-					<artifactId>jersey-test-framework-core</artifactId>
-				</exclusion>
-				<exclusion>
-					<groupId>javax.servlet</groupId>
-					<artifactId>javax.servlet-api</artifactId>
-				</exclusion>
-				<exclusion>
-					<groupId>com.sun.jersey</groupId>
-					<artifactId>jersey-client</artifactId>
-				</exclusion>
-				<exclusion>
-					<groupId>com.sun.jersey</groupId>
-					<artifactId>jersey-grizzly2</artifactId>
-				</exclusion>
-				<exclusion>
-					<groupId>org.glassfish.grizzly</groupId>
-					<artifactId>grizzly-http</artifactId>
-				</exclusion>
-				<exclusion>
-					<groupId>org.glassfish.grizzly</groupId>
-					<artifactId>grizzly-framework</artifactId>
-				</exclusion>
-				<exclusion>
-					<groupId>org.glassfish.grizzly</groupId>
-					<artifactId>grizzly-http-server</artifactId>
-				</exclusion>
-				<exclusion>
-					<groupId>org.glassfish.grizzly</groupId>
-					<artifactId>grizzly-rcm</artifactId>
-				</exclusion>
-				<exclusion>
-					<groupId>org.glassfish.grizzly</groupId>
-					<artifactId>grizzly-http-servlet</artifactId>
-				</exclusion>
-				<exclusion>
-					<groupId>org.glassfish</groupId>
-					<artifactId>javax.servlet</artifactId>
-				</exclusion>
-				<exclusion>
-					<groupId>com.sun.jersey.contribs</groupId>
-					<artifactId>jersey-guice</artifactId>
-				</exclusion>
-			</exclusions>
-		</dependency>
-	</dependencies>
-
-</project>

http://git-wip-us.apache.org/repos/asf/flink/blob/ac7d8715/flink-shaded-hadoop/pom.xml
----------------------------------------------------------------------
diff --git a/flink-shaded-hadoop/pom.xml b/flink-shaded-hadoop/pom.xml
index 47c1a7f..d1c8d49 100644
--- a/flink-shaded-hadoop/pom.xml
+++ b/flink-shaded-hadoop/pom.xml
@@ -35,7 +35,7 @@ under the License.
 	<packaging>pom</packaging>
 
 	<modules>
-		<module>${shading-artifact.name}</module>
+		<module>flink-shaded-hadoop2</module>
 	</modules>
 
 	<dependencies>
@@ -50,12 +50,6 @@ under the License.
 	<profiles>
 		<profile>
 			<id>include-yarn-tests</id>
-			<activation>
-				<property>
-					<!-- Please do not remove the 'hadoop2' comment. See ./tools/generate_specific_pom.sh -->
-					<!--hadoop2--><name>!hadoop.profile</name>
-				</property>
-			</activation>
 			<modules>
 				<module>flink-shaded-include-yarn-tests</module>
 			</modules>

http://git-wip-us.apache.org/repos/asf/flink/blob/ac7d8715/flink-streaming-connectors/flink-connector-filesystem/pom.xml
----------------------------------------------------------------------
diff --git a/flink-streaming-connectors/flink-connector-filesystem/pom.xml b/flink-streaming-connectors/flink-connector-filesystem/pom.xml
index ef7e72b..20c48c6 100644
--- a/flink-streaming-connectors/flink-connector-filesystem/pom.xml
+++ b/flink-streaming-connectors/flink-connector-filesystem/pom.xml
@@ -52,7 +52,7 @@ under the License.
 
 		<dependency>
 			<groupId>org.apache.flink</groupId>
-			<artifactId>${shading-artifact.name}</artifactId>
+			<artifactId>flink-shaded-hadoop2</artifactId>
 			<version>${project.version}</version>
 			<scope>provided</scope>
 		</dependency>

http://git-wip-us.apache.org/repos/asf/flink/blob/ac7d8715/flink-streaming-connectors/pom.xml
----------------------------------------------------------------------
diff --git a/flink-streaming-connectors/pom.xml b/flink-streaming-connectors/pom.xml
index 78e39ca..a4c27c2 100644
--- a/flink-streaming-connectors/pom.xml
+++ b/flink-streaming-connectors/pom.xml
@@ -48,25 +48,11 @@ under the License.
 		<module>flink-connector-nifi</module>
 		<module>flink-connector-cassandra</module>
 		<module>flink-connector-redis</module>
+		<module>flink-connector-filesystem</module>
 	</modules>
 
 	<!-- See main pom.xml for explanation of profiles -->
 	<profiles>
-		<profile>
-			<id>hadoop-2</id>
-			<activation>
-				<property>
-					<!-- Please do not remove the 'hadoop2' comment. See ./tools/generate_specific_pom.sh -->
-					<!--hadoop2--><name>!hadoop.profile</name>
-				</property>
-			</activation>
-			<modules>
-				<!-- Include the flink-fs-tests project only for HD2.
-				 	The HDFS minicluster interfaces changed between the two versions.
-				 -->
-				<module>flink-connector-filesystem</module>
-			</modules>
-		</profile>
 		<!--
 			We include the kinesis module only optionally because it contains a dependency
 			licenced under the "Amazon Software License".

http://git-wip-us.apache.org/repos/asf/flink/blob/ac7d8715/flink-yarn-tests/pom.xml
----------------------------------------------------------------------
diff --git a/flink-yarn-tests/pom.xml b/flink-yarn-tests/pom.xml
index 68e4752..3c2bc67 100644
--- a/flink-yarn-tests/pom.xml
+++ b/flink-yarn-tests/pom.xml
@@ -81,7 +81,7 @@ under the License.
 
 		<dependency>
 			<groupId>org.apache.flink</groupId>
-			<artifactId>${shading-artifact.name}</artifactId>
+			<artifactId>flink-shaded-hadoop2</artifactId>
 			<version>${project.version}</version>
 			<scope>test</scope>
 		</dependency>

http://git-wip-us.apache.org/repos/asf/flink/blob/ac7d8715/flink-yarn/pom.xml
----------------------------------------------------------------------
diff --git a/flink-yarn/pom.xml b/flink-yarn/pom.xml
index 9ce57d5..9a3cc8e 100644
--- a/flink-yarn/pom.xml
+++ b/flink-yarn/pom.xml
@@ -52,7 +52,7 @@ under the License.
 
 		<dependency>
 			<groupId>org.apache.flink</groupId>
-			<artifactId>${shading-artifact.name}</artifactId>
+			<artifactId>flink-shaded-hadoop2</artifactId>
 			<version>${project.version}</version>
 		</dependency>
 

http://git-wip-us.apache.org/repos/asf/flink/blob/ac7d8715/pom.xml
----------------------------------------------------------------------
diff --git a/pom.xml b/pom.xml
index d9c2947..4e0dce9 100644
--- a/pom.xml
+++ b/pom.xml
@@ -76,17 +76,16 @@ under the License.
 		<module>flink-dist</module>
 		<module>flink-mesos</module>
 		<module>flink-metrics</module>
+		<module>flink-yarn</module>
+		<module>flink-fs-tests</module>
 	</modules>
 
 	<properties>
 		<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
 		<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
-		<!-- Mutable name of the hadoop shading artifact. The module name can contain a scala version suffix. -->
-		<shading-artifact.name>error</shading-artifact.name>
 		<!-- Internal property to reduce build times on TravisCi -->
 		<flink-fast-tests-pattern>never-match-me</flink-fast-tests-pattern>
-		<hadoop-one.version>1.2.1</hadoop-one.version>
-		<hadoop-two.version>2.3.0</hadoop-two.version>
+		<hadoop.version>2.3.0</hadoop.version>
 		<!-- Need to use a user property here because the surefire
 			 forkCount is not exposed as a property. With this we can set
 			 it on the "mvn" commandline in travis. -->
@@ -441,43 +440,6 @@ under the License.
 			</properties>
 		</profile>
 
-		<!-- Profile to switch to Hadoop 1 -->
-		<profile>
-			<id>hadoop-1</id>
-			<activation>
-				<property>
-					<!-- Please do not remove the 'hadoop1' comment. See ./tools/generate_specific_pom.sh -->
-					<!--hadoop1--><name>hadoop.profile</name><value>1</value>
-				</property>
-			</activation>
-			<properties>
-				<hadoop.version>${hadoop-one.version}</hadoop.version>
-				<shading-artifact.name>flink-shaded-hadoop1</shading-artifact.name>
-			</properties>
-		</profile>
-
-		<!-- Default profile, which builds for Hadoop 2 -->
-		<profile>
-			<id>hadoop-2</id>
-			<activation>
-				<property>
-					<!-- Please do not remove the 'hadoop2' comment. See ./tools/generate_specific_pom.sh -->
-					<!--hadoop2--><name>!hadoop.profile</name>
-				</property>
-			</activation>
-			<properties>
-				<hadoop.version>${hadoop-two.version}</hadoop.version>
-				<shading-artifact.name>flink-shaded-hadoop2</shading-artifact.name>
-			</properties>
-			<modules>
-				<module>flink-yarn</module>
-				<!-- Include the flink-fs-tests project only for HD2.
-				 	The HDFS minicluster interfaces changed between the two versions.
-				 -->
-				<module>flink-fs-tests</module>
-			</modules>
-		</profile>
-
 		<!-- Profile to deactivate the YARN tests -->
 		<profile>
 			<id>include-yarn-tests</id>

http://git-wip-us.apache.org/repos/asf/flink/blob/ac7d8715/tools/create_release_files.sh
----------------------------------------------------------------------
diff --git a/tools/create_release_files.sh b/tools/create_release_files.sh
index 8a4b796..fdf50a5 100755
--- a/tools/create_release_files.sh
+++ b/tools/create_release_files.sh
@@ -70,7 +70,6 @@ OLD_VERSION=${OLD_VERSION:-1.1-SNAPSHOT}
 RELEASE_VERSION=${NEW_VERSION}
 RELEASE_CANDIDATE=${RELEASE_CANDIDATE:-rc1}
 RELEASE_BRANCH=${RELEASE_BRANCH:-master}
-NEW_VERSION_HADOOP1=${NEW_VERSION_HADOOP1:-"$RELEASE_VERSION-hadoop1"}
 USER_NAME=${USER_NAME:-yourapacheidhere}
 MVN=${MVN:-mvn}
 GPG=${GPG:-gpg}
@@ -115,7 +114,6 @@ make_source_release() {
   #change version of documentation
   cd docs
   perl -pi -e "s#^version: .*#version: ${NEW_VERSION}#" _config.yml
-  perl -pi -e "s#^version_hadoop1: .*#version_hadoop1: ${NEW_VERSION}-hadoop1#" _config.yml
   perl -pi -e "s#^version_short: .*#version_short: ${NEW_VERSION}#" _config.yml
   cd ..
 
@@ -185,14 +183,6 @@ deploy_to_maven() {
   echo "Deploying Scala 2.10 version"
   cd tools && ./change-scala-version.sh 2.10 && cd ..
  $MVN clean deploy -Dgpg.executable=$GPG -Prelease,docs-and-source --settings deploysettings.xml -DskipTests -Dgpg.keyname=$GPG_KEY -Dgpg.passphrase=$GPG_PASSPHRASE -DretryFailedDeploymentCount=10
-
-
-  echo "Deploying Scala 2.10 / hadoop 1 version"
-  ../generate_specific_pom.sh $NEW_VERSION $NEW_VERSION_HADOOP1 pom.xml
-
-
-  sleep 4
-  $MVN clean deploy -Dgpg.executable=$GPG -Prelease,docs-and-source --settings deploysettings.xml -DskipTests -Dgpg.keyname=$GPG_KEY -Dgpg.passphrase=$GPG_PASSPHRASE -DretryFailedDeploymentCount=10
 }
 
 copy_data() {
@@ -211,7 +201,6 @@ prepare
 
 make_source_release
 
-make_binary_release "hadoop1" "-Dhadoop.profile=1" 2.10
 make_binary_release "hadoop2" "" 2.10
 make_binary_release "hadoop24" "-Dhadoop.version=2.4.1" 2.10
 make_binary_release "hadoop26" "-Dhadoop.version=2.6.3" 2.10

http://git-wip-us.apache.org/repos/asf/flink/blob/ac7d8715/tools/deploy_to_maven.sh
----------------------------------------------------------------------
diff --git a/tools/deploy_to_maven.sh b/tools/deploy_to_maven.sh
index 74ae8a7..676d8d0 100755
--- a/tools/deploy_to_maven.sh
+++ b/tools/deploy_to_maven.sh
@@ -75,13 +75,8 @@ rm -rf dummy-lifecycle-mapping-plugin
 
 
 CURRENT_FLINK_VERSION=`getVersion`
-if [[ "$CURRENT_FLINK_VERSION" == *-SNAPSHOT ]]; then
-    CURRENT_FLINK_VERSION_HADOOP1=${CURRENT_FLINK_VERSION/-SNAPSHOT/-hadoop1-SNAPSHOT}
-else
-    CURRENT_FLINK_VERSION_HADOOP1="$CURRENT_FLINK_VERSION-hadoop1"
-fi
 
-echo "detected current version as: '$CURRENT_FLINK_VERSION' ; hadoop1: $CURRENT_FLINK_VERSION_HADOOP1"
+echo "detected current version as: '$CURRENT_FLINK_VERSION'"
 
 #
 # This script deploys our project to sonatype SNAPSHOTS.
@@ -92,13 +87,6 @@ if [[ $CURRENT_FLINK_VERSION == *SNAPSHOT* ]] ; then
     MVN_SNAPSHOT_OPTS="-B -Pdocs-and-source -DskipTests -Drat.skip=true -Drat.ignoreErrors=true \
         -DretryFailedDeploymentCount=10 --settings deploysettings.xml clean deploy"
 
-    # Deploy hadoop v1 to maven
-    echo "Generating poms for hadoop1"
-    ./tools/generate_specific_pom.sh $CURRENT_FLINK_VERSION $CURRENT_FLINK_VERSION_HADOOP1 pom.hadoop1.xml
-    mvn -f pom.hadoop1.xml ${MVN_SNAPSHOT_OPTS}
-    # deploy to s3
-    deploy_to_s3 $CURRENT_FLINK_VERSION "hadoop1"
-
     # hadoop2 scala 2.10
     echo "deploy standard version (hadoop2) for scala 2.10"
     mvn ${MVN_SNAPSHOT_OPTS}

http://git-wip-us.apache.org/repos/asf/flink/blob/ac7d8715/tools/generate_specific_pom.sh
----------------------------------------------------------------------
diff --git a/tools/generate_specific_pom.sh b/tools/generate_specific_pom.sh
deleted file mode 100755
index 11feefd..0000000
--- a/tools/generate_specific_pom.sh
+++ /dev/null
@@ -1,129 +0,0 @@
-#!/usr/bin/env bash
-################################################################################
-#  Licensed to the Apache Software Foundation (ASF) under one
-#  or more contributor license agreements.  See the NOTICE file
-#  distributed with this work for additional information
-#  regarding copyright ownership.  The ASF licenses this file
-#  to you under the Apache License, Version 2.0 (the
-#  "License"); you may not use this file except in compliance
-#  with the License.  You may obtain a copy of the License at
-#
-#      http://www.apache.org/licenses/LICENSE-2.0
-#
-#  Unless required by applicable law or agreed to in writing, software
-#  distributed under the License is distributed on an "AS IS" BASIS,
-#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-#  See the License for the specific language governing permissions and
-# limitations under the License.
-################################################################################
-
-
-#
-# Inspired by and modified from:
-# https://github.com/apache/hbase/blob/trunk/dev-support/generate-hadoopX-poms.sh
-#
-
-function usage {
-  echo "Usage: $0 CURRENT_VERSION NEW_VERSION [POM_NAME]"
-  echo "For example, $0 0.8-incubating-SNAPSHOT 0.8-hadoop1-incubating-SNAPSHOT"
-  echo "Presumes VERSION has hadoop1 or hadoop2 in it. POM_NAME is optional and"
-  echo "allows to specify a different name for the generated pom."
-  exit 1
-}
-
-if [[ "$#" -lt 2 ]]; then usage; fi
-
-old_version="$1"
-new_version="$2"
-new_pom_name="$3"
-
-# Get hadoop version from the new Flink version
-hadoop_version=`echo "$new_version" | sed -n 's/.*\(hadoop[12]\).*/\1/p'`
-if [[ -z $hadoop_version ]]; then usage ; fi
-
-echo "hadoop version $hadoop_version"
-
-
-here="`dirname \"$0\"`"              # relative
-here="`( cd \"$here\" && pwd )`"  # absolutized and normalized
-if [ -z "$here" ] ; then
-  # error; for some reason, the path is not accessible
-  # to the script (e.g. permissions re-evaled after suid)
-  exit 1  # fail
-fi
-flink_home="`dirname \"$here\"`"
-
-
-hadoop1=
-hadoop2=
-default='<name>!hadoop.profile<\/name>'
-notdefault='<name>hadoop.profile<\/name>'
-case "${hadoop_version}" in
-  hadoop1)
-    hadoop1="${default}"
-    hadoop2="${notdefault}<value>2<\/value>"
-    ;;
-  hadoop2)
-    hadoop1="${notdefault}<value>1<\/value>"
-    hadoop2="${default}"
-    ;;
- *) echo "Unknown ${hadoop_version}"
-    usage
-    ;;
-esac
-
-nupom=$new_pom_name
-if [[ -z "$new_pom_name" ]]; then
-  nupom="pom.${hadoop_version}.xml"
-fi
-echo "Using $nupom as name for the generated pom file."
-
-# export relevant variables for find command subshells
-export old_version
-export new_version
-export hadoop1
-export hadoop2
-export nupom
-
-# paths may contain spaces
-find "$flink_home" -name pom.xml -exec bash -c '
-  
-  p="$0"
-
-  # write into tmp file because in-place replacement is not possible (if nupom="pom.xml")
-  tmp_nuname1="`dirname "$p"`/__generate_specific_pom_tmp1"
-  tmp_nuname2="`dirname "$p"`/__generate_specific_pom_tmp2"
-  nuname="`dirname "$p"`/${nupom}"
-  # Now we do search and replace of explicit strings.  The best way of
-  # seeing what the below does is by doing a diff between the original
-  # pom and the generated pom (pom.hadoop1.xml or pom.hadoop2.xml). We
-  # replace the version string in all poms, we change modules to
-  # include reference to the non- standard pom name, we adjust
-  # relative paths so child modules can find the parent pom, and we
-  # enable/disable hadoop 1 and hadoop 2 profiles as appropriate
-  # removing a comment string too. We output the new pom beside the
-  # original.
-
-  # To avoid accidentally replace version numbers in our dependencies 
-  # sharing the version number with the current release use the following.
-
-  echo "p=$p, old_version=${old_version}, new_version=$new_version"
-
-  perl -0777 -pe "s:<groupId>org.apache.flink</groupId>\n([\t ]*<artifactId>([a-z]+-)+[a-z0-9\.\_]+</artifactId>\n[\t ]*)<version>${old_version}</version>:<groupId>org.apache.flink</groupId>\n\1<version>${new_version}</version>:g" "$p" > "$tmp_nuname1"
-
-  # replace the version also in the quickstart poms (so that the hadoop1 quickstart creates an hadoop1 project)
-  perl -0777 -pe "s:<flink.version>${old_version}</flink.version>:<flink.version>${new_version}</flink.version>:g" "$tmp_nuname1" > "$tmp_nuname2"
-
-  # Alternatively when no version collisions are present this is enough:
-  # sed -e "s/${old_version}/${new_version}/" $p > "$tmp_nuname1"
-
-  sed -e "s/\(<module>[^<]*\)/\1\/${nupom}/" \
-    -e "s/\(relativePath>\.\.\)/\1\/${nupom}/" \
-    -e "s/<!--hadoop1-->.*name>.*/${hadoop1}/" \
-    -e "s/<!--hadoop2-->.*name>.*/${hadoop2}/" \
-    "$tmp_nuname2" > "$tmp_nuname1"
-  rm "$tmp_nuname2"
-  mv "$tmp_nuname1" "$nuname"
-
-' "{}" \; # pass file name as argument
-

