spark-commits mailing list archives

From sro...@apache.org
Subject spark git commit: [DOCS] Fix unreachable links in the document
Date Tue, 12 Sep 2017 14:07:25 GMT
Repository: spark
Updated Branches:
  refs/heads/branch-2.2 10c68366e -> 63098dc31


[DOCS] Fix unreachable links in the document

## What changes were proposed in this pull request?

Recently, I found two unreachable links in the documentation and fixed them.
Because these are small documentation-only changes, I haven't filed a JIRA issue, but please
let me know if you think I should.

## How was this patch tested?

Tested manually.

Author: Kousuke Saruta <sarutak@oss.nttdata.co.jp>

Closes #19195 from sarutak/fix-unreachable-link.

(cherry picked from commit 957558235b7537c706c6ab4779655aa57838ebac)
Signed-off-by: Sean Owen <sowen@cloudera.com>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/63098dc3
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/63098dc3
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/63098dc3

Branch: refs/heads/branch-2.2
Commit: 63098dc3170bf4289091d97b7beb63dd0e2356c5
Parents: 10c6836
Author: Kousuke Saruta <sarutak@oss.nttdata.co.jp>
Authored: Tue Sep 12 15:07:04 2017 +0100
Committer: Sean Owen <sowen@cloudera.com>
Committed: Tue Sep 12 15:07:21 2017 +0100

----------------------------------------------------------------------
 docs/building-spark.md        | 2 +-
 docs/rdd-programming-guide.md | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/63098dc3/docs/building-spark.md
----------------------------------------------------------------------
diff --git a/docs/building-spark.md b/docs/building-spark.md
index 777635a..14164f1 100644
--- a/docs/building-spark.md
+++ b/docs/building-spark.md
@@ -119,7 +119,7 @@ should run continuous compilation (i.e. wait for changes). However, this has not
 extensively. A couple of gotchas to note:
 
 * it only scans the paths `src/main` and `src/test` (see
-[docs](http://scala-tools.org/mvnsites/maven-scala-plugin/usage_cc.html)), so it will only work
+[docs](http://davidb.github.io/scala-maven-plugin/example_cc.html)), so it will only work
 from within certain submodules that have that structure.
 
 * you'll typically need to run `mvn install` from the project root for compilation within

http://git-wip-us.apache.org/repos/asf/spark/blob/63098dc3/docs/rdd-programming-guide.md
----------------------------------------------------------------------
diff --git a/docs/rdd-programming-guide.md b/docs/rdd-programming-guide.md
index 8e6c36b..e3a31b8 100644
--- a/docs/rdd-programming-guide.md
+++ b/docs/rdd-programming-guide.md
@@ -604,7 +604,7 @@ before the `reduce`, which would cause `lineLengths` to be saved in memory after
 Spark's API relies heavily on passing functions in the driver program to run on the cluster.
 There are two recommended ways to do this:
 
-* [Anonymous function syntax](http://docs.scala-lang.org/tutorials/tour/anonymous-function-syntax.html),
+* [Anonymous function syntax](http://docs.scala-lang.org/tour/basics.html#functions),
   which can be used for short pieces of code.
 * Static methods in a global singleton object. For example, you can define `object MyFunctions` and then
   pass `MyFunctions.func1`, as follows:
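
For reference, here is a minimal, self-contained Scala sketch of the two approaches above; the
guide's own snippet falls outside this hunk, and the input path, method body, and object names
below are illustrative only:

import org.apache.spark.{SparkConf, SparkContext}

// Illustrative global singleton: its methods behave like static methods, so passing
// MyFunctions.func1 to Spark does not capture any enclosing instance.
object MyFunctions {
  def func1(s: String): String = s.trim
}

object FunctionPassingExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("function-passing").setMaster("local[*]"))
    val lines = sc.textFile("data.txt")  // hypothetical input file

    // 1) Anonymous function syntax, for short pieces of code.
    val lineLengths = lines.map(s => s.length)

    // 2) A static method in a global singleton object.
    val trimmed = lines.map(MyFunctions.func1)

    println(s"total length: ${lineLengths.sum()}, lines: ${trimmed.count()}")
    sc.stop()
  }
}

Either form keeps the passed function free of references to the driver's enclosing instances,
which is what the guide recommends for code shipped to the cluster.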


