flink-commits mailing list archives

From u..@apache.org
Subject [4/6] flink git commit: [docs] Move libraries to batch and streaming guides
Date Wed, 03 Feb 2016 18:23:41 GMT
http://git-wip-us.apache.org/repos/asf/flink/blob/35ec26cd/docs/apis/batch/libs/ml/svm.md
----------------------------------------------------------------------
diff --git a/docs/apis/batch/libs/ml/svm.md b/docs/apis/batch/libs/ml/svm.md
new file mode 100644
index 0000000..6d09482
--- /dev/null
+++ b/docs/apis/batch/libs/ml/svm.md
@@ -0,0 +1,223 @@
+---
+mathjax: include
+title: SVM using CoCoA
+# Sub navigation
+sub-nav-group: batch
+sub-nav-parent: flinkml
+sub-nav-title: SVM (CoCoA)
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+* This will be replaced by the TOC
+{:toc}
+
+## Description
+
+Implements a soft-margin SVM using the communication-efficient distributed dual coordinate
+ascent (CoCoA) algorithm with the hinge-loss function.
+The algorithm solves the following minimization problem:
+
+$$\min_{\mathbf{w} \in \mathbb{R}^d} \frac{\lambda}{2} \left\lVert \mathbf{w} \right\rVert^2 + \frac{1}{n} \sum_{i=1}^n l_{i}\left(\mathbf{w}^T\mathbf{x}_i\right)$$
+
+with $\mathbf{w}$ being the weight vector, $\lambda$ being the regularization constant,
+$$\mathbf{x}_i \in \mathbb{R}^d$$ being the data points and $$l_{i}$$ being the convex loss
+functions, which can also depend on the labels $$y_{i} \in \mathbb{R}$$.
+In the current implementation the regularizer is the $\ell_2$-norm and the loss functions are the hinge-loss functions:
+
+  $$l_{i} = \max\left(0, 1 - y_{i} \mathbf{w}^T\mathbf{x}_i \right)$$
+
+With these choices, the problem definition is equivalent to an SVM with soft-margin.
+Thus, the algorithm allows us to train a soft-margin SVM.
+
+The minimization problem is solved by applying stochastic dual coordinate ascent (SDCA).
+In order to make the algorithm efficient in a distributed setting, the CoCoA algorithm calculates
+several iterations of SDCA locally on a data block before merging the local updates into a
+valid global state.
+This state is redistributed to the different data partitions where the next round of local SDCA
+iterations is then executed.
+The number of outer iterations and local SDCA iterations control the overall network costs, because
+network communication is only required once per outer iteration.
+The local SDCA iterations are embarrassingly parallel once the individual data partitions have been
+distributed across the cluster.
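+
+A minimal, self-contained sketch of this outer/inner loop structure (plain Scala rather than the
+actual FlinkML implementation; `localSDCA` is a placeholder for the local solver):
+
+{% highlight scala %}
+type Vec = Array[Double]
+
+// Placeholder for one pass of local dual coordinate ascent over a single data block.
+def localSDCA(block: Seq[(Vec, Double)], w: Vec, localIterations: Int): Vec = ???
+
+def cocoa(blocks: Seq[Seq[(Vec, Double)]], w0: Vec,
+          iterations: Int, localIterations: Int, stepsize: Double): Vec = {
+  var w = w0
+  for (_ <- 1 to iterations) {                                  // one communication round per outer iteration
+    val updates = blocks.map(localSDCA(_, w, localIterations))  // embarrassingly parallel local SDCA
+    val scale = stepsize / blocks.size                          // effective scaling of the updates
+    w = updates.foldLeft(w) { (acc, u) =>
+      acc.zip(u).map { case (wi, ui) => wi + scale * ui }       // merge local updates into the global state
+    }
+  }
+  w
+}
+{% endhighlight %}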
+
+The implementation of this algorithm is based on the work of
+[Jaggi et al.](http://arxiv.org/abs/1409.1458)
+
+## Operations
+
+`SVM` is a `Predictor`.
+As such, it supports the `fit` and `predict` operations.
+
+### Fit
+
+SVM is trained given a set of `LabeledVector`:
+
+* `fit: DataSet[LabeledVector] => Unit`
+
+### Predict
+
+For all subtypes of FlinkML's `Vector`, SVM predicts the corresponding class label:
+
+* `predict[T <: Vector]: DataSet[T] => DataSet[(T, Double)]`, where the `(T, Double)` tuple
+  corresponds to (original_features, label)
+
+If we call evaluate with a `DataSet[(Vector, Double)]`, we make a prediction on the class label
+for each example, and return a `DataSet[(Double, Double)]`. In each tuple the first element
+is the true value, as provided by the input `DataSet[(Vector, Double)]`, and the second element
+is the predicted value. You can then use these `(truth, prediction)` tuples to evaluate
+the algorithm's performance.
+
+* `predict: DataSet[(Vector, Double)] => DataSet[(Double, Double)]`
+
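+Given a labeled test set, a minimal sketch of the evaluate call described above (assuming the same
+imports and the fitted `svm` instance from the Examples section below) might look like this:
+
+{% highlight scala %}
+// Hedged sketch: obtain (truth, prediction) pairs for a labeled test set.
+val labeledTestDS: DataSet[(Vector, Double)] = ???
+val truthAndPredictions: DataSet[(Double, Double)] = svm.evaluate(labeledTestDS)
+{% endhighlight %}
+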
+## Parameters
+
+The SVM implementation can be controlled by the following parameters:
+
+<table class="table table-bordered">
+<thead>
+  <tr>
+    <th class="text-left" style="width: 20%">Parameters</th>
+    <th class="text-center">Description</th>
+  </tr>
+</thead>
+
+<tbody>
+  <tr>
+    <td><strong>Blocks</strong></td>
+    <td>
+      <p>
+        Sets the number of blocks into which the input data will be split.
+        On each block the local stochastic dual coordinate ascent method is executed.
+        This number should be set at least to the degree of parallelism.
+        If no value is specified, then the parallelism of the input DataSet is used as the number of blocks.
+        (Default value: <strong>None</strong>)
+      </p>
+    </td>
+  </tr>
+  <tr>
+    <td><strong>Iterations</strong></td>
+    <td>
+      <p>
+        Defines the maximum number of iterations of the outer loop method.
+        In other words, it defines how often the SDCA method is applied to the blocked data.
+        After each iteration, the locally computed weight vector updates have to be reduced to update the global weight vector value.
+        The new weight vector is broadcast to all SDCA tasks at the beginning of each iteration.
+        (Default value: <strong>10</strong>)
+      </p>
+    </td>
+  </tr>
+  <tr>
+    <td><strong>LocalIterations</strong></td>
+    <td>
+      <p>
+        Defines the maximum number of SDCA iterations.
+        In other words, it defines how many data points are drawn from each local data block to calculate the stochastic dual coordinate ascent.
+        (Default value: <strong>10</strong>)
+      </p>
+    </td>
+  </tr>
+  <tr>
+    <td><strong>Regularization</strong></td>
+    <td>
+      <p>
+        Defines the regularization constant of the SVM algorithm.
+        The higher the value, the smaller the 2-norm of the weight vector will be.
+        In the case of an SVM with hinge loss, this means that the SVM margin will be wider even though it might contain some misclassified points.
+        (Default value: <strong>1.0</strong>)
+      </p>
+    </td>
+  </tr>
+  <tr>
+    <td><strong>Stepsize</strong></td>
+    <td>
+      <p>
+        Defines the initial step size for the updates of the weight vector.
+        The larger the step size, the larger the contribution of the weight vector updates to the next weight vector value.
+        The effective scaling of the updates is $\frac{stepsize}{blocks}$.
+        This value has to be tuned if the algorithm becomes unstable.
+        (Default value: <strong>1.0</strong>)
+      </p>
+    </td>
+  </tr>
+  <tr>
+    <td><strong>ThresholdValue</strong></td>
+    <td>
+      <p>
+        Defines the limiting value for the decision function above which examples are labeled as
+        positive (+1.0). Examples with a decision function value below this value are classified
+        as negative (-1.0). To obtain the raw decision function values, set the
+        OutputDecisionFunction parameter to true. (Default value: <strong>0.0</strong>)
+      </p>
+    </td>
+  </tr>
+  <tr>
+    <td><strong>OutputDecisionFunction</strong></td>
+    <td>
+      <p>
+        Determines whether the predict and evaluate functions of the SVM should return the distance
+        to the separating hyperplane or the binary class labels. Setting this to true will
+        return the raw distance to the hyperplane for each example. Setting it to false will
+        return the binary class label (+1.0, -1.0). (Default value: <strong>false</strong>)
+      </p>
+    </td>
+  </tr>
+  <tr>
+  <td><strong>Seed</strong></td>
+  <td>
+    <p>
+      Defines the seed to initialize the random number generator.
+      The seed directly controls which data points are chosen for the SDCA method.
+      (Default value: <strong>Random Long Integer</strong>)
+    </p>
+  </td>
+</tr>
+</tbody>
+</table>
+
+## Examples
+
+{% highlight scala %}
+import org.apache.flink.api.scala._
+import org.apache.flink.ml.math.Vector
+import org.apache.flink.ml.common.LabeledVector
+import org.apache.flink.ml.classification.SVM
+import org.apache.flink.ml.RichExecutionEnvironment
+
+val pathToTrainingFile: String = ???
+val pathToTestingFile: String = ???
+val env = ExecutionEnvironment.getExecutionEnvironment
+
+// Read the training data set, from a LibSVM formatted file
+val trainingDS: DataSet[LabeledVector] = env.readLibSVM(pathToTrainingFile)
+
+// Create the SVM learner
+val svm = SVM()
+  .setBlocks(10)
+
+// Learn the SVM model
+svm.fit(trainingDS)
+
+// Read the testing data set
+val testingDS: DataSet[Vector] = env.readLibSVM(pathToTestingFile).map(_.vector)
+
+// Calculate the predictions for the testing data set
+val predictionDS: DataSet[(Vector, Double)] = svm.predict(testingDS)
+
+{% endhighlight %}
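+
+The example above only sets the number of blocks. A hedged sketch of how the remaining parameters
+from the table above might be configured, assuming FlinkML's usual fluent setter naming (the exact
+method names may differ):
+
+{% highlight scala %}
+// Hypothetical configuration sketch; parameter meanings follow the table above.
+val tunedSvm = SVM()
+  .setBlocks(10)            // number of data blocks
+  .setIterations(10)        // outer CoCoA iterations
+  .setLocalIterations(10)   // SDCA iterations per block
+  .setRegularization(0.5)   // regularization constant
+  .setStepsize(0.1)         // initial step size (scaled by stepsize/blocks)
+  .setSeed(42L)             // seed for the SDCA sampling
+{% endhighlight %}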

http://git-wip-us.apache.org/repos/asf/flink/blob/35ec26cd/docs/apis/batch/libs/table.md
----------------------------------------------------------------------
diff --git a/docs/apis/batch/libs/table.md b/docs/apis/batch/libs/table.md
new file mode 100644
index 0000000..ce4d016
--- /dev/null
+++ b/docs/apis/batch/libs/table.md
@@ -0,0 +1,398 @@
+---
+title: "Table API - Relational Queries"
+is_beta: true
+# Top navigation
+top-nav-group: libs
+top-nav-pos: 3
+top-nav-title: "Relational: Table"
+# Sub navigation
+sub-nav-group: batch
+sub-nav-parent: libs
+sub-nav-pos: 3
+sub-nav-title: Table
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+**The Table API is an experimental feature.**
+
+Flink provides an API that allows specifying operations using SQL-like expressions. Instead of
+manipulating a `DataSet` or `DataStream`, you work with a `Table` on which relational operations can
+be performed.
+
+The following dependency must be added to your project when using the Table API:
+
+{% highlight xml %}
+<dependency>
+  <groupId>org.apache.flink</groupId>
+  <artifactId>flink-table{{ site.scala_version_suffix }}</artifactId>
+  <version>{{site.version }}</version>
+</dependency>
+{% endhighlight %}
+
+Note that the Table API is currently not part of the binary distribution. See [here]({{ site.baseurl }}/apis/cluster_execution.html#linking-with-modules-not-contained-in-the-binary-distribution) for how to link with it for cluster execution.
+
+## Scala Table API
+
+The Table API can be enabled by importing `org.apache.flink.api.scala.table._`. This enables
+implicit conversions that allow converting a DataSet or DataStream to a Table. The following example
+shows how a DataSet can be converted, how relational queries can be specified, and how a Table can be
+converted back to a DataSet:
+
+{% highlight scala %}
+import org.apache.flink.api.scala._
+import org.apache.flink.api.scala.table._
+
+case class WC(word: String, count: Int)
+val input = env.fromElements(WC("hello", 1), WC("hello", 1), WC("ciao", 1))
+val expr = input.toTable
+val result = expr.groupBy('word).select('word, 'count.sum as 'count).toDataSet[WC]
+{% endhighlight %}
+
+The expression DSL uses Scala symbols to refer to field names and we use code generation to
+transform expressions to efficient runtime code. Please note that the conversion to and from
+Tables only works when using Scala case classes or Flink POJOs. Please check out
+the [programming guide]({{ site.baseurl }}/apis/index.html) to learn the requirements for a class to be
+considered a POJO.
+
+This is another example that shows how you
+can join two Tables:
+
+{% highlight scala %}
+case class MyResult(a: String, d: Int)
+
+val input1 = env.fromElements(...).toTable('a, 'b)
+val input2 = env.fromElements(...).toTable('c, 'd)
+val joined = input1.join(input2).where("b = a && d > 42").select("a, d").toDataSet[MyResult]
+{% endhighlight %}
+
+Notice how a DataSet can be converted to a Table while specifying new
+names for its fields. This can also be used to disambiguate fields before a join operation. This
+example also shows that you can use Strings to specify relational expressions.
+
+Please refer to the Scaladoc (and Javadoc) for a full list of supported operations and a
+description of the expression syntax.
+
+## Java Table API
+
+When using Java, Tables can be converted to and from DataSet and DataStream using `TableEnvironment`.
+This example is equivalent to the above Scala example:
+
+{% highlight java %}
+
+public class WC {
+
+  public WC(String word, int count) {
+    this.word = word; this.count = count;
+  }
+
+  public WC() {} // empty constructor to satisfy POJO requirements
+
+  public String word;
+  public int count;
+}
+
+...
+
+ExecutionEnvironment env = ExecutionEnvironment.createCollectionsEnvironment();
+TableEnvironment tableEnv = new TableEnvironment();
+
+DataSet<WC> input = env.fromElements(
+        new WC("Hello", 1),
+        new WC("Ciao", 1),
+        new WC("Hello", 1));
+
+Table table = tableEnv.fromDataSet(input);
+
+Table filtered = table
+        .groupBy("word")
+        .select("word.count as count, word")
+        .filter("count = 2");
+
+DataSet<WC> result = tableEnv.toDataSet(filtered, WC.class);
+{% endhighlight %}
+
+When using Java, the embedded DSL for specifying expressions cannot be used. Only String expressions
+are supported. They support exactly the same feature set as the expression DSL.
+
+## Table API Operators
+The Table API provides a domain-specific language to execute language-integrated queries on structured data in Scala and Java.
+This section gives a brief overview of the available operators. You can find more details about the operators in the [Javadoc]({{site.baseurl}}/api/java/org/apache/flink/api/table/Table.html).
+
+<div class="codetabs" markdown="1">
+<div data-lang="java" markdown="1">
+
+<br />
+
+<table class="table table-bordered">
+  <thead>
+    <tr>
+      <th class="text-left" style="width: 20%">Operators</th>
+      <th class="text-center">Description</th>
+    </tr>
+  </thead>
+
+  <tbody>
+    <tr>
+      <td><strong>Select</strong></td>
+      <td>
+        <p>Similar to a SQL SELECT statement. Performs a select operation.</p>
+{% highlight java %}
+Table in = tableEnv.fromDataSet(ds, "a, b, c");
+Table result = in.select("a, c as d");
+{% endhighlight %}
+      </td>
+    </tr>
+
+    <tr>
+      <td><strong>As</strong></td>
+      <td>
+        <p>Rename fields.</p>
+{% highlight java %}
+Table in = tableEnv.fromDataSet(ds, "a, b, c");
+Table result = in.as("d, e, f");
+{% endhighlight %}
+      </td>
+    </tr>
+
+    <tr>
+      <td><strong>Filter</strong></td>
+      <td>
+        <p>Similar to a SQL WHERE clause. Filters out elements that do not pass the filter predicate.</p>
+{% highlight java %}
+Table in = tableEnv.fromDataSet(ds, "a, b, c");
+Table result = in.filter("a % 2 = 0");
+{% endhighlight %}
+      </td>
+    </tr>
+
+    <tr>
+      <td><strong>Where</strong></td>
+      <td>
+        <p>Similar to a SQL WHERE clause. Filters out elements that do not pass the filter predicate.</p>
+{% highlight java %}
+Table in = tableEnv.fromDataSet(ds, "a, b, c");
+Table result = in.where("b = 'red'");
+{% endhighlight %}
+      </td>
+    </tr>
+
+    <tr>
+      <td><strong>GroupBy</strong></td>
+      <td>
+        <p>Similar to a SQL GROUP BY clause. Groups the elements on the grouping keys, with a following
+        aggregation operator to aggregate rows on a per-group basis.</p>
+{% highlight java %}
+Table in = tableEnv.fromDataSet(ds, "a, b, c");
+Table result = in.groupBy("a").select("a, b.sum as d");
+{% endhighlight %}
+        <p><i>Note:</i> Flink allows non-aggregated columns in the select list that are not named in
+        the groupBy clause. This can improve performance by avoiding unnecessary sorting and grouping
+        when a non-aggregated column is co-grouped with the columns in the groupBy clause. For example:</p>
+{% highlight java %}
+Table in = tableEnv.fromDataSet(ds, "a, b, c");
+Table result = in.groupBy("a").select("a, b, c.sum as d");
+{% endhighlight %}
+      </td>
+    </tr>
+
+    <tr>
+      <td><strong>Join</strong></td>
+      <td>
+        <p>Similar to a SQL JOIN clause. Joins two tables. Both tables must have distinct field names,
+        and a where clause defining the join condition is mandatory.</p>
+{% highlight java %}
+Table left = tableEnv.fromDataSet(ds1, "a, b, c");
+Table right = tableEnv.fromDataSet(ds2, "d, e, f");
+Table result = left.join(right).where("a = d").select("a, b, e");
+{% endhighlight %}
+      </td>
+    </tr>
+
+    <tr>
+      <td><strong>Union</strong></td>
+      <td>
+        <p>Similar to a SQL UNION ALL clause. Unions two tables. Both tables must have identical schemas (field names and types).</p>
+{% highlight java %}
+Table left = tableEnv.fromDataSet(ds1, "a, b, c");
+Table right = tableEnv.fromDataSet(ds2, "a, b, c");
+Table result = left.union(right);
+{% endhighlight %}
+      </td>
+    </tr>
+
+  </tbody>
+</table>
+
+</div>
+<div data-lang="scala" markdown="1">
+<br />
+
+<table class="table table-bordered">
+  <thead>
+    <tr>
+      <th class="text-left" style="width: 20%">Operators</th>
+      <th class="text-center">Description</th>
+    </tr>
+  </thead>
+
+  <tbody>
+    <tr>
+      <td><strong>Select</strong></td>
+      <td>
+        <p>Similar to a SQL SELECT statement. Performs a select operation.</p>
+{% highlight scala %}
+val in = ds.as('a, 'b, 'c)
+val result = in.select('a, 'c as 'd)
+{% endhighlight %}
+      </td>
+    </tr>
+
+    <tr>
+      <td><strong>As</strong></td>
+      <td>
+        <p>Rename fields.</p>
+{% highlight scala %}
+val in = ds.as('a, 'b, 'c)
+{% endhighlight %}
+      </td>
+    </tr>
+
+    <tr>
+      <td><strong>Filter</strong></td>
+      <td>
+        <p>Similar to a SQL WHERE clause. Filters out elements that do not pass the filter predicate.</p>
+{% highlight scala %}
+val in = ds.as('a, 'b, 'c)
+val result = in.filter('a % 2 === 0)
+{% endhighlight %}
+      </td>
+    </tr>
+
+    <tr>
+      <td><strong>Where</strong></td>
+      <td>
+        <p>Similar to a SQL WHERE clause. Filters out elements that do not pass the filter predicate.</p>
+{% highlight scala %}
+val in = ds.as('a, 'b, 'c)
+val result = in.where('b === "red")
+{% endhighlight %}
+      </td>
+    </tr>
+
+    <tr>
+      <td><strong>GroupBy</strong></td>
+      <td>
+        <p>Similar to a SQL GROUP BY clause. Groups the elements on the grouping keys, with a following
+        aggregation operator to aggregate rows on a per-group basis.</p>
+{% highlight scala %}
+val in = ds.as('a, 'b, 'c)
+val result = in.groupBy('a).select('a, 'b.sum as 'd)
+{% endhighlight %}
+        <p><i>Note:</i> Flink allows non-aggregated columns in the select list that are not named in
+        the groupBy clause. This can improve performance by avoiding unnecessary sorting and grouping
+        when a non-aggregated column is co-grouped with the columns in the groupBy clause. For example:</p>
+{% highlight scala %}
+val in = ds.as('a, 'b, 'c)
+val result = in.groupBy('a).select('a, 'b, 'c.sum as 'd)
+{% endhighlight %}
+      </td>
+    </tr>
+
+    <tr>
+      <td><strong>Join</strong></td>
+      <td>
+        <p>Similar to a SQL JOIN clause. Joins two tables. Both tables must have distinct field names,
+        and a where clause defining the join condition is mandatory.</p>
+{% highlight scala %}
+val left = ds1.as('a, 'b, 'c)
+val right = ds2.as('d, 'e, 'f)
+val result = left.join(right).where('a === 'd).select('a, 'b, 'e)
+{% endhighlight %}
+      </td>
+    </tr>
+
+    <tr>
+      <td><strong>Union</strong></td>
+      <td>
+        <p>Similar to a SQL UNION ALL clause. Unions two tables. Both tables must have identical schemas (field names and types).</p>
+{% highlight scala %}
+val left = ds1.as('a, 'b, 'c)
+val right = ds2.as('a, 'b, 'c)
+val result = left.union(right)
+{% endhighlight %}
+      </td>
+    </tr>
+  </tbody>
+</table>
+</div>
+</div>
+
+## Expression Syntax
+Some of the operators in the previous section expect an expression. These can either be specified using the
+embedded Scala DSL or as a String expression. Please refer to the examples above to learn how expressions can be
+formulated.
+
+This is the complete EBNF grammar for expressions:
+
+{% highlight ebnf %}
+
+expression = single expression , { "," , single expression } ;
+
+single expression = alias | logic ;
+
+alias = logic | logic , "AS" , field reference ;
+
+logic = comparison , [ ( "&&" | "||" ) , comparison ] ;
+
+comparison = term , [ ( "=" | "!=" | ">" | ">=" | "<" | "<=" ) , term ] ;
+
+term = product , [ ( "+" | "-" ) , product ] ;
+
+product = binary bitwise , [ ( "*" | "/" | "%" ) , binary bitwise ] ;
+
+binary bitwise = unary , [ ( "&" | "!" | "^" ) , unary ] ;
+
+unary = [ "!" | "-" | "~" ] , suffix ;
+
+suffix = atom | aggregation | cast | as | substring ;
+
+aggregation = atom , [ ".sum" | ".min" | ".max" | ".count" | ".avg" ] ;
+
+cast = atom , ".cast(" , data type , ")" ;
+
+data type = "BYTE" | "SHORT" | "INT" | "LONG" | "FLOAT" | "DOUBLE" | "BOOL" | "BOOLEAN" | "STRING" | "DATE" ;
+
+as = atom , ".as(" , field reference , ")" ;
+
+substring = atom , ".substring(" , substring start , ["," substring end] , ")" ;
+
+substring start = single expression ;
+
+substring end = single expression ;
+
+atom = ( "(" , single expression , ")" ) | literal | field reference ;
+
+{% endhighlight %}
+
+Here, `literal` is a valid Java literal and `field reference` specifies a column in the data. The
+column names follow Java identifier syntax.
+
+Only the types `LONG` and `STRING` can be cast to `DATE` and vice versa. A `LONG` cast to `DATE` must be a milliseconds timestamp. A `STRING` cast to `DATE` must have the format "`yyyy-MM-dd HH:mm:ss.SSS`", "`yyyy-MM-dd`", "`HH:mm:ss`", or be a milliseconds timestamp. By default, all timestamps refer to the UTC timezone beginning from January 1, 1970, 00:00:00 in milliseconds.
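+
+For illustration, a hedged sketch of String expressions that exercise several of the grammar rules
+above; the Table `in` and its columns `a` (INT) and `b` (STRING) are hypothetical:
+
+{% highlight scala %}
+// Filtering with logical/comparison operators, then casting and taking a substring.
+val result = in
+  .where("a % 2 = 0 && b != 'red'")
+  .select("a.cast(STRING) as aString, b.substring(0, 3) as prefix")
+{% endhighlight %}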

http://git-wip-us.apache.org/repos/asf/flink/blob/35ec26cd/docs/apis/streaming/libs/cep.md
----------------------------------------------------------------------
diff --git a/docs/apis/streaming/libs/cep.md b/docs/apis/streaming/libs/cep.md
new file mode 100644
index 0000000..aa23876
--- /dev/null
+++ b/docs/apis/streaming/libs/cep.md
@@ -0,0 +1,300 @@
+---
+title: "FlinkCEP - Complex event processing for Flink"
+# Top navigation
+top-nav-group: libs
+top-nav-pos: 2
+top-nav-title: CEP
+# Sub navigation
+sub-nav-group: streaming
+sub-nav-id: cep
+sub-nav-pos: 1
+sub-nav-parent: libs
+sub-nav-title: CEP
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+FlinkCEP is the complex event processing library for Flink.
+It allows you to easily detect complex event patterns in an endless stream of data.
+Complex events can then be constructed from matching sequences.
+This gives you the opportunity to quickly get hold of what's really important in your data.
+
+## Getting Started
+
+If you want to jump right in, you have to [set up a Flink program]({{ site.baseurl }}/apis/batch/index.html#linking-with-flink).
+Next, you have to add the FlinkCEP dependency to the `pom.xml` of your project.
+
+{% highlight xml %}
+<dependency>
+  <groupId>org.apache.flink</groupId>
+  <artifactId>flink-cep{{ site.scala_version_suffix }}</artifactId>
+  <version>{{site.version }}</version>
+</dependency>
+{% endhighlight %}
+
+Note that FlinkCEP is currently not part of the binary distribution.
+See linking with it for cluster execution [here]({{site.baseurl}}/apis/cluster_execution.html#linking-with-modules-not-contained-in-the-binary-distribution).
+
+Now you can start writing your first CEP program using the pattern API.
+
+{% highlight java %}
+DataStream<Event> input = ...
+
+Pattern<Event, ?> pattern = Pattern.begin("start").where(evt -> evt.getId() == 42)
+    .next("middle").subtype(SubEvent.class).where(subEvt -> subEvt.getVolume() >= 10.0)
+    .followedBy("end").where(evt -> evt.getName().equals("end"));
+    
+PatternStream<Event> patternStream = CEP.from(input, pattern);
+
+DataStream<Alert> result = patternStream.select(pattern -> {
+    return createAlertFrom(pattern);
+});
+{% endhighlight %}
+
+Note that we have used Java 8 lambdas here to make the example more succinct.
+
+## The Pattern API
+
+The pattern API allows you to quickly define complex event patterns.
+
+Each pattern consists of multiple stages or what we call states.
+In order to go from one state to the next, the user can specify conditions.
+These conditions can be the contiguity of events or a filter condition on an event.
+
+Each pattern has to start with an initial state:
+
+{% highlight java %}
+Pattern<Event, ?> start = Pattern.<Event>begin("start");
+{% endhighlight %}
+
+Each state must have a unique name to identify the matched events later on.
+Additionally, we can specify a filter condition for the event to be accepted as the start event via the `where` method.
+  
+{% highlight java %}
+start.where(new FilterFunction<Event>() {
+    @Override
+    public boolean filter(Event value) {
+        return ... // some condition
+    }
+});
+{% endhighlight %}
+
+We can also restrict the type of the accepted event to some subtype of the initial event type (here `Event`) via the `subtype` method.
+
+{% highlight java %}
+start.subtype(SubEvent.class).where(new FilterFunction<SubEvent>() {
+    @Override
+    public boolean filter(SubEvent value) {
+        return ... // some condition
+    }
+});
+{% endhighlight %}
+
+As can be seen here, the subtype condition can also be combined with an additional filter condition on the subtype.
+In fact you can always provide multiple conditions by calling `where` and `subtype` multiple times.
+These conditions will then be combined using the logical AND operator.
+
+Next, we can append further states to detect complex patterns.
+We can control the contiguity of two succeeding events to be accepted by the pattern.
+
+Strict contiguity means that two matching events have to directly succeed each other.
+This means that no other events can occur in between.
+A strict contiguity pattern state can be created via the `next` method.
+
+{% highlight java %}
+Pattern<Event, ?> strictNext = start.next("middle");
+{% endhighlight %}
+
+Non-strict contiguity means that other events are allowed to occur in-between two matching events.
+A non-strict contiguity pattern state can be created via the `followedBy` method.
+
+{% highlight java %}
+Pattern<Event, ?> nonStrictNext = start.followedBy("middle");
+{% endhighlight %}
+
+It is also possible to define a temporal constraint for the pattern to be valid.
+For example, one can define that a pattern should occur within 10 seconds via the `within` method.
+
+{% highlight java %}
+next.within(Time.seconds(10));
+{% endhighlight %}
+
+<br />
+
+<table class="table table-bordered">
+    <thead>
+        <tr>
+            <th class="text-left" style="width: 25%">Pattern Operation</th>
+            <th class="text-center">Description</th>
+        </tr>
+    </thead>
+    <tbody>
+        <tr>
+            <td><strong>Begin</strong></td>
+            <td>
+            <p>Defines a starting pattern state:</p>
+    {% highlight java %}
+    Pattern<Event, ?> start = Pattern.<Event>begin("start");
+    {% endhighlight %}
+            </td>
+        </tr>
+        <tr>
+            <td><strong>Next</strong></td>
+            <td>
+                <p>Appends a new pattern state. A matching event has to directly succeed the previous matching event:</p>
+{% highlight java %}
+Pattern<Event, ?> next = start.next("next");
+{% endhighlight %}
+            </td>
+        </tr>
+        <tr>
+            <td><strong>FollowedBy</strong></td>
+            <td>
+                <p>Appends a new pattern state. Other events can occur between a matching event and the previous matching event:</p>
+{% highlight java %}
+Pattern<Event, ?> next = start.followedBy("next");
+{% endhighlight %}
+            </td>
+        </tr>
+        <tr>
+            <td><strong>Where</strong></td>
+            <td>
+                <p>Defines a filter condition for the current pattern state. An event can only match the state if it passes the filter:</p>
+{% highlight java %}
+patternState.where(new FilterFunction<Event>() {
+    @Override
+    public boolean filter(Event value) throws Exception {
+        return ... // some condition
+    }
+});
+{% endhighlight %}
+            </td>
+        </tr>
+       <tr>
+           <td><strong>Subtype</strong></td>
+           <td>
+               <p>Defines a subtype condition for the current pattern state. An event can only match the state if it is of this subtype:</p>
+{% highlight java %}
+patternState.subtype(SubEvent.class);
+{% endhighlight %}
+           </td>
+       </tr>
+       <tr>
+          <td><strong>Within</strong></td>
+          <td>
+              <p>Defines the maximum time interval for an event sequence to match the pattern. If a non-completed event sequence exceeds this time, it is discarded:</p>
+{% highlight java %}
+patternState.within(Time.seconds(10));
+{% endhighlight %}
+          </td>
+      </tr>
+  </tbody>
+</table>
+
+### Detecting Patterns
+
+In order to run a stream of events against your pattern, you have to create a `PatternStream`.
+Given an input stream `input` and a pattern `pattern`, you create the `PatternStream` by calling
+
+{% highlight java %}
+DataStream<Event> input = ...
+Pattern<Event, ?> pattern = ...
+
+PatternStream<Event> patternStream = CEP.from(input, pattern);
+{% endhighlight %}
+
+### Selecting from Patterns
+
+Once you have obtained a `PatternStream` you can select from detected event sequences via the `select` or `flatSelect` methods.
+The `select` method requires a `PatternSelectFunction` implementation.
+A `PatternSelectFunction` has a `select` method which is called for each matching event sequence.
+It receives a map of string/event pairs of the matched events.
+The string is defined by the name of the state to which the event has been matched.
+The `select` method can return exactly one result.
+
+{% highlight java %}
+class MyPatternSelectFunction<IN, OUT> implements PatternSelectFunction<IN, OUT> {
+    @Override
+    public OUT select(Map<String, IN> pattern) {
+        IN startEvent = pattern.get("start");
+        IN endEvent = pattern.get("end");
+        
+        return new OUT(startEvent, endEvent);
+    }
+}
+{% endhighlight %}
+
+A `PatternFlatSelectFunction` is similar to the `PatternSelectFunction`, with the only distinction that it can return an arbitrary number of results.
+In order to do this, the `select` method has an additional `Collector` parameter which is used for the element output.
+
+{% highlight java %}
+class MyPatternFlatSelectFunction<IN, OUT> implements PatternFlatSelectFunction<IN, OUT> {
+    @Override
+    public void select(Map<String, IN> pattern, Collector<OUT> collector) {
+        IN startEvent = pattern.get("start");
+        IN endEvent = pattern.get("end");
+        
+        for (int i = 0; i < startEvent.getValue(); i++ ) {
+            collector.collect(new OUT(startEvent, endEvent));
+        }
+    }
+}
+{% endhighlight %}
+
+## Examples
+
+The following example detects the pattern `start, middle(name = "error") -> end(name = "critical")` on a keyed data stream of `Events`.
+The events are keyed by their ids and a valid pattern has to occur within 10 seconds.
+The whole processing is done in event time.
+
+{% highlight java %}
+StreamExecutionEnvironment env = ...
+env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime);
+
+DataStream<Event> input = ...
+
+DataStream<Event> partitionedInput = input.keyBy(new KeySelector<Event, Integer>() {
+	@Override
+	public Integer getKey(Event value) throws Exception {
+		return value.getId();
+	}
+});
+
+Pattern<Event, ?> pattern = Pattern.<Event>begin("start")
+	.next("middle").where(new FilterFunction<Event>() {
+		@Override
+		public boolean filter(Event value) throws Exception {
+			return value.getName().equals("error");
+		}
+	}).followedBy("end").where(new FilterFunction<Event>() {
+		@Override
+		public boolean filter(Event value) throws Exception {
+			return value.getName().equals("critical");
+		}
+	}).within(Time.seconds(10));
+
+PatternStream<Event> patternStream = CEP.from(partitionedInput, pattern);
+
+DataStream<Alert> alerts = patternStream.select(new PatternSelectFunction<Event, Alert>() {
+	@Override
+	public Alert select(Map<String, Event> pattern) throws Exception {
+		return new Alert(pattern.get("start"), pattern.get("end"));
+	}
+});
+{% endhighlight %}

http://git-wip-us.apache.org/repos/asf/flink/blob/35ec26cd/docs/apis/streaming/libs/index.md
----------------------------------------------------------------------
diff --git a/docs/apis/streaming/libs/index.md b/docs/apis/streaming/libs/index.md
new file mode 100644
index 0000000..a7362a6
--- /dev/null
+++ b/docs/apis/streaming/libs/index.md
@@ -0,0 +1,27 @@
+---
+title: "Streaming Libraries"
+sub-nav-group: streaming
+sub-nav-id: libs
+sub-nav-pos: 4
+sub-nav-title: Libraries
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+- Complex event processing: [CEP](cep.html)

http://git-wip-us.apache.org/repos/asf/flink/blob/35ec26cd/docs/apis/streaming/savepoints.md
----------------------------------------------------------------------
diff --git a/docs/apis/streaming/savepoints.md b/docs/apis/streaming/savepoints.md
index f04845c..63f0980 100644
--- a/docs/apis/streaming/savepoints.md
+++ b/docs/apis/streaming/savepoints.md
@@ -2,7 +2,7 @@
 title: "Savepoints"
 is_beta: false
 sub-nav-group: streaming
-sub-nav-pos: 4
+sub-nav-pos: 5
 ---
 <!--
 Licensed to the Apache Software Foundation (ASF) under one

http://git-wip-us.apache.org/repos/asf/flink/blob/35ec26cd/docs/apis/streaming/storm_compatibility.md
----------------------------------------------------------------------
diff --git a/docs/apis/streaming/storm_compatibility.md b/docs/apis/streaming/storm_compatibility.md
index a777a40..d646040 100644
--- a/docs/apis/streaming/storm_compatibility.md
+++ b/docs/apis/streaming/storm_compatibility.md
@@ -2,7 +2,7 @@
 title: "Storm Compatibility"
 is_beta: true
 sub-nav-group: streaming
-sub-nav-pos: 5
+sub-nav-pos: 6
 ---
 <!--
 Licensed to the Apache Software Foundation (ASF) under one

http://git-wip-us.apache.org/repos/asf/flink/blob/35ec26cd/docs/libs/cep/index.md
----------------------------------------------------------------------
diff --git a/docs/libs/cep/index.md b/docs/libs/cep/index.md
index 04e2b73..e84e7d7 100644
--- a/docs/libs/cep/index.md
+++ b/docs/libs/cep/index.md
@@ -1,15 +1,5 @@
 ---
 title: "FlinkCEP - Complex event processing for Flink"
-# Top navigation
-top-nav-group: libs
-top-nav-pos: 2
-top-nav-title: CEP
-# Sub navigation
-sub-nav-group: batch
-sub-nav-id: flinkcep
-sub-nav-pos: 2
-sub-nav-parent: libs
-sub-nav-title: CEP
 ---
 <!--
 Licensed to the Apache Software Foundation (ASF) under one
@@ -30,271 +20,6 @@ specific language governing permissions and limitations
 under the License.
 -->
 
-FlinkCEP is the complex event processing library for Flink.
-It allows you to easily detect complex event patterns in a stream of endless data.
-Complex events can then be constructed from matching sequences.
-This gives you the opportunity to quickly get hold of what's really important in your data.
+<meta http-equiv="refresh" content="1; url={{ site.baseurl }}/apis/streaming/libs/cep.html" />
 
-## Getting Started
-
-If you want to jump right in, you have to [set up a Flink program]({{ site.baseurl }}/apis/batch/index.html#linking-with-flink).
-Next, you have to add the FlinkCEP dependency to the `pom.xml` of your project.
-
-{% highlight xml %}
-<dependency>
-  <groupId>org.apache.flink</groupId>
-  <artifactId>flink-cep{{ site.scala_version_suffix }}</artifactId>
-  <version>{{site.version }}</version>
-</dependency>
-{% endhighlight %}
-
-Note that FlinkCEP is currently not part of the binary distribution.
-See linking with it for cluster execution [here]({{site.baseurl}}/apis/cluster_execution.html#linking-with-modules-not-contained-in-the-binary-distribution).
-
-Now you can start writing your first CEP program using the pattern API.
-
-{% highlight java %}
-DataStream<Event> input = ...
-
-Pattern<Event, ?> pattern = Pattern.begin("start").where(evt -> evt.getId() == 42)
-    .next("middle").subtype(SubEvent.class).where(subEvt -> subEvt.getVolume() >= 10.0)
-    .followedBy("end").where(evt -> evt.getName().equals("end"));
-    
-PatternStream<Event> patternStream = CEP.from(input, pattern);
-
-DataStream<Alert> result = patternStream.select(pattern -> {
-    return createAlertFrom(pattern);
-});
-{% endhighlight %}
-
-Note that we have used Java 8 lambdas here to make the example more succinct.
-
-## The Pattern API
-
-The pattern API allows you to quickly define complex event patterns.
-
-Each pattern consists of multiple stages or what we call states.
-In order to go from one state to the next, the user can specify conditions.
-These conditions can be the contiguity of events or a filter condition on an event.
-
-Each pattern has to start with an initial state:
-
-{% highlight java %}
-Pattern<Event, ?> start = Pattern.<Event>begin("start");
-{% endhighlight %}
-
-Each state must have an unique name to identify the matched events later on.
-Additionally, we can specify a filter condition for the event to be accepted as the start event via the `where` method.
-  
-{% highlight java %}
-start.where(new FilterFunction<Event>() {
-    @Override
-    public boolean filter(Event value) {
-        return ... // some condition
-    }
-});
-{% endhighlight %}
-
-We can also restrict the type of the accepted event to some subtype of the initial event type (here `Event`) via the `subtype` method.
-
-{% highlight java %}
-start.subtype(SubEvent.class).where(new FilterFunction<SubEvent>() {
-    @Override
-    public boolean filter(SubEvent value) {
-        return ... // some condition
-    }
-});
-{% endhighlight %}
-
-As it can be seen here, the subtype condition can also be combined with an additional filter condition on the subtype.
-In fact you can always provide multiple conditions by calling `where` and `subtype` multiple times.
-These conditions will then be combined using the logical AND operator.
-
-Next, we can append further states to detect complex patterns.
-We can control the contiguity of two succeeding events to be accepted by the pattern.
-
-Strict contiguity means that two matching events have to succeed directly.
-This means that no other events can occur in between.
-A strict contiguity pattern state can be created via the `next` method.
-
-{% highlight java %}
-Pattern<Event, ?> strictNext = start.next("middle");
-{% endhighlight %}
-
-Non-strict contiguity means that other events are allowed to occur in-between two matching events.
-A non-strict contiguity pattern state can be created via the `followedBy` method.
-
-It is also possible to define a temporal constraint for the pattern to be valid.
-For example, one can define that a pattern should occur within 10 seconds via the `within` method.
-
-{% highlight java %}
-next.within(Time.seconds(10));
-{% endhighlight %}
-
-{% highlight java %}
-Pattern<Event, ?> nonStrictNext = start.followedBy("middle");
-{% endhighlight %}
-
-<br />
-
-<table class="table table-bordered">
-    <thead>
-        <tr>
-            <th class="text-left" style="width: 25%">Pattern Operation</th>
-            <th class="text-center">Description</th>
-        </tr>
-    </thead>
-    <tbody>
-        <tr>
-            <td><strong>Begin</strong></td>
-            <td>
-            <p>Defines a starting pattern state:</p>
-    {% highlight java %}
-    Pattern<Event, ?> start = Pattern.<Event>begin("start");
-    {% endhighlight %}
-            </td>
-        </tr>
-        <tr>
-            <td><strong>Next</strong></td>
-            <td>
-                <p>Appends a new pattern state. A matching event has to directly succeed the previous matching event:</p>
-{% highlight java %}
-Pattern<Event, ?> next = start.next("next");
-{% endhighlight %}
-            </td>
-        </tr>
-        <tr>
-            <td><strong>FollowedBy</strong></td>
-            <td>
-                <p>Appends a new pattern state. Other events can occur between a matching event and the previous matching event:</p>
-{% highlight java %}
-Pattern<Event, ?> next = start.followedBy("next");
-{% endhighlight %}
-            </td>
-        </tr>
-        <tr>
-            <td><strong>Where</strong></td>
-            <td>
-                <p>Defines a filter condition for the current pattern state. Only if an event passes the filter, it can match the state:</p>
-{% highlight java %}
-patternState.where(new FilterFunction<Event>() {
-    @Override
-    public boolean filter(Event value) throws Exception {
-        return ... // some condition
-    }
-});
-{% endhighlight %}
-            </td>
-        </tr>
-       <tr>
-           <td><strong>Subtype</strong></td>
-           <td>
-               <p>Defines a subtype condition for the current pattern state. Only if an event is of this subtype, it can match the state:</p>
-{% highlight java %}
-patternState.subtype(SubEvent.class);
-{% endhighlight %}
-           </td>
-       </tr>
-       <tr>
-          <td><strong>Within</strong></td>
-          <td>
-              <p>Defines the maximum time interval for an event sequence to match the pattern. If a non-completed event sequence exceeds this time, it is discarded:</p>
-{% highlight java %}
-patternState.within(Time.seconds(10));
-{% endhighlight %}
-          </td>
-      </tr>
-  </tbody>
-</table>
-
-### Detecting Patterns
-
-In order to run a stream of events against your pattern, you have to create a `PatternStream`.
-Given an input stream `input` and a pattern `pattern`, you create the `PatternStream` by calling
-
-{% highlight java %}
-DataStream<Event> input = ...
-Pattern<Event, ?> pattern = ...
-
-PatternStream<Event> patternStream = CEP.from(input, pattern);
-{% endhighlight %}
-
-### Selecting from Patterns
-
-Once you have obtained a `PatternStream` you can select from detected event sequences via the `select` or `flatSelect` methods.
-The `select` method requires a `PatternSelectFunction` implementation.
-A `PatternSelectFunction` has a `select` method which is called for each matching event sequence.
-It receives a map of string/event pairs of the matched events.
-The string is defined by the name of the state to which the event has been matched.
-The `select` method can return exactly one result.
-
-{% highlight java %}
-class MyPatternSelectFunction<IN, OUT> implements PatternSelectFunction<IN, OUT> {
-    @Override
-    public OUT select(Map<String, IN> pattern) {
-        IN startEvent = pattern.get("start");
-        IN endEvent = pattern.get("end");
-        
-        return new OUT(startEvent, endEvent);
-    }
-}
-{% endhighlight %}
-
-A `PatternFlatSelectFunction` is similar to the `PatternSelectFunction`, with the only distinction that it can return an arbitrary number of results.
-In order to do this, the `select` method has an additional `Collector` parameter which is used for the element output.
-
-{% highlight java %}
-class MyPatternFlatSelectFunction<IN, OUT> implements PatternFlatSelectFunction<IN, OUT> {
-    @Override
-    public void select(Map<String, IN> pattern, Collector<OUT> collector) {
-        IN startEvent = pattern.get("start");
-        IN endEvent = pattern.get("end");
-        
-        for (int i = 0; i < startEvent.getValue(); i++ ) {
-            collector.collect(new OUT(startEvent, endEvent));
-        }
-    }
-}
-{% endhighlight %}
-
-## Examples
-
-The following example detects the pattern `start, middle(name = "error") -> end(name = "critical")` on a keyed data stream of `Events`.
-The events are keyed by their ids and a valid pattern has to occur within 10 seconds.
-The whole processing is done with event time.
-
-{% highlight java %}
-StreamExecutionEnvironment env = ...
-env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime);
-
-DataStream<Event> input = ...
-
-DataStream<Event> partitionedInput = input.keyBy(new KeySelector<Event, Integer>() {
-	@Override
-	public Integer getKey(Event value) throws Exception {
-		return value.getId();
-	}
-});
-
-Pattern<Event, ?> pattern = Pattern.<Event>begin("start")
-	.next("middle").where(new FilterFunction<Event>() {
-		@Override
-		public boolean filter(Event value) throws Exception {
-			return value.getName().equals("name");
-		}
-	}).followedBy("end").where(new FilterFunction<Event>() {
-		@Override
-		public boolean filter(Event value) throws Exception {
-			return value.getName().equals("critical");
-		}
-	}).within(Time.seconds(10));
-
-PatternStream<Event> patternStream = CEP.from(partitionedInput, pattern);
-
-DataStream<Alert> alerts = patternStream.select(new PatternSelectFunction<Event, Alert>() {
-	@Override
-	public Alert select(Map<String, Event> pattern) throws Exception {
-		return new Alert(pattern.get("start"), pattern.get("end"))
-	}
-});
-{% endhighlight %}
+The *CEP guide* has been moved. Redirecting to [{{ site.baseurl }}/apis/streaming/libs/cep.html]({{ site.baseurl }}/apis/streaming/libs/cep.html) in 1 second.

http://git-wip-us.apache.org/repos/asf/flink/blob/35ec26cd/docs/libs/fig/LICENSE.txt
----------------------------------------------------------------------
diff --git a/docs/libs/fig/LICENSE.txt b/docs/libs/fig/LICENSE.txt
deleted file mode 100644
index 35b8673..0000000
--- a/docs/libs/fig/LICENSE.txt
+++ /dev/null
@@ -1,17 +0,0 @@
-All image files in the folder and its subfolders are
-licensed to the Apache Software Foundation (ASF) under one
-or more contributor license agreements.  See the NOTICE file
-distributed with this work for additional information
-regarding copyright ownership.  The ASF licenses this file
-to you under the Apache License, Version 2.0 (the
-"License"); you may not use this file except in compliance
-with the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing,
-software distributed under the License is distributed on an
-"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-KIND, either express or implied.  See the License for the
-specific language governing permissions and limitations
-under the License.
\ No newline at end of file

http://git-wip-us.apache.org/repos/asf/flink/blob/35ec26cd/docs/libs/fig/gelly-example-graph.png
----------------------------------------------------------------------
diff --git a/docs/libs/fig/gelly-example-graph.png b/docs/libs/fig/gelly-example-graph.png
deleted file mode 100644
index abef960..0000000
Binary files a/docs/libs/fig/gelly-example-graph.png and /dev/null differ

http://git-wip-us.apache.org/repos/asf/flink/blob/35ec26cd/docs/libs/fig/gelly-filter.png
----------------------------------------------------------------------
diff --git a/docs/libs/fig/gelly-filter.png b/docs/libs/fig/gelly-filter.png
deleted file mode 100644
index cb09744..0000000
Binary files a/docs/libs/fig/gelly-filter.png and /dev/null differ

http://git-wip-us.apache.org/repos/asf/flink/blob/35ec26cd/docs/libs/fig/gelly-gsa-sssp-result.png
----------------------------------------------------------------------
diff --git a/docs/libs/fig/gelly-gsa-sssp-result.png b/docs/libs/fig/gelly-gsa-sssp-result.png
deleted file mode 100644
index 6ae74dd..0000000
Binary files a/docs/libs/fig/gelly-gsa-sssp-result.png and /dev/null differ

http://git-wip-us.apache.org/repos/asf/flink/blob/35ec26cd/docs/libs/fig/gelly-gsa-sssp1.png
----------------------------------------------------------------------
diff --git a/docs/libs/fig/gelly-gsa-sssp1.png b/docs/libs/fig/gelly-gsa-sssp1.png
deleted file mode 100644
index 1141e14..0000000
Binary files a/docs/libs/fig/gelly-gsa-sssp1.png and /dev/null differ

http://git-wip-us.apache.org/repos/asf/flink/blob/35ec26cd/docs/libs/fig/gelly-gsa-sssp2.png
----------------------------------------------------------------------
diff --git a/docs/libs/fig/gelly-gsa-sssp2.png b/docs/libs/fig/gelly-gsa-sssp2.png
deleted file mode 100644
index edf19b8..0000000
Binary files a/docs/libs/fig/gelly-gsa-sssp2.png and /dev/null differ

http://git-wip-us.apache.org/repos/asf/flink/blob/35ec26cd/docs/libs/fig/gelly-reduceOnEdges.png
----------------------------------------------------------------------
diff --git a/docs/libs/fig/gelly-reduceOnEdges.png b/docs/libs/fig/gelly-reduceOnEdges.png
deleted file mode 100644
index ffb674d..0000000
Binary files a/docs/libs/fig/gelly-reduceOnEdges.png and /dev/null differ

http://git-wip-us.apache.org/repos/asf/flink/blob/35ec26cd/docs/libs/fig/gelly-reduceOnNeighbors.png
----------------------------------------------------------------------
diff --git a/docs/libs/fig/gelly-reduceOnNeighbors.png b/docs/libs/fig/gelly-reduceOnNeighbors.png
deleted file mode 100644
index 63137b8..0000000
Binary files a/docs/libs/fig/gelly-reduceOnNeighbors.png and /dev/null differ

http://git-wip-us.apache.org/repos/asf/flink/blob/35ec26cd/docs/libs/fig/gelly-union.png
----------------------------------------------------------------------
diff --git a/docs/libs/fig/gelly-union.png b/docs/libs/fig/gelly-union.png
deleted file mode 100644
index b00f831..0000000
Binary files a/docs/libs/fig/gelly-union.png and /dev/null differ

http://git-wip-us.apache.org/repos/asf/flink/blob/35ec26cd/docs/libs/fig/gelly-vc-sssp1.png
----------------------------------------------------------------------
diff --git a/docs/libs/fig/gelly-vc-sssp1.png b/docs/libs/fig/gelly-vc-sssp1.png
deleted file mode 100644
index 5feab62..0000000
Binary files a/docs/libs/fig/gelly-vc-sssp1.png and /dev/null differ

http://git-wip-us.apache.org/repos/asf/flink/blob/35ec26cd/docs/libs/fig/gelly-vc-sssp2.png
----------------------------------------------------------------------
diff --git a/docs/libs/fig/gelly-vc-sssp2.png b/docs/libs/fig/gelly-vc-sssp2.png
deleted file mode 100644
index 67976b3..0000000
Binary files a/docs/libs/fig/gelly-vc-sssp2.png and /dev/null differ

