spark-commits mailing list archives

From sro...@apache.org
Subject spark git commit: [SPARK-19106][DOCS] Styling for the configuration docs is broken
Date Sat, 07 Jan 2017 19:19:33 GMT
Repository: spark
Updated Branches:
  refs/heads/branch-2.1 86b66216d -> c95b58557


[SPARK-19106][DOCS] Styling for the configuration docs is broken

configuration.html section headings were not specified correctly in markdown and weren't
rendering or being recognized correctly. Removed extra p tags and pulled level 4 titles up
to level 3, since level 3 had been skipped. This improves the TOC.
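
For illustration, two representative hunks from the diff below: section titles move up one
heading level,

    -#### Shuffle Behavior
    +### Shuffle Behavior

and <p> wrappers inside table cells become plain text (with <br /> separating paragraphs):

    -            <p>Whether to enable SSL connections on all supported protocols.</p>
    +            Whether to enable SSL connections on all supported protocols.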

Doc build, manual check.

Author: Sean Owen <sowen@cloudera.com>

Closes #16490 from srowen/SPARK-19106.

(cherry picked from commit 54138f6e89abfc17101b4f2812715784a2b98331)
Signed-off-by: Sean Owen <sowen@cloudera.com>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/c95b5855
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/c95b5855
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/c95b5855

Branch: refs/heads/branch-2.1
Commit: c95b58557dec2f4708d5efd9314edd80e0975fc8
Parents: 86b6621
Author: Sean Owen <sowen@cloudera.com>
Authored: Sat Jan 7 19:15:51 2017 +0000
Committer: Sean Owen <sowen@cloudera.com>
Committed: Sat Jan 7 19:19:30 2017 +0000

----------------------------------------------------------------------
 docs/configuration.md | 78 ++++++++++++++++++++++++++++------------------
 1 file changed, 47 insertions(+), 31 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/c95b5855/docs/configuration.md
----------------------------------------------------------------------
diff --git a/docs/configuration.md b/docs/configuration.md
index 9c325b6..7c51e13 100644
--- a/docs/configuration.md
+++ b/docs/configuration.md
@@ -59,6 +59,7 @@ The following format is accepted:
     1p or 1pb (pebibytes = 1024 tebibytes)
 
 ## Dynamically Loading Spark Properties
+
 In some cases, you may want to avoid hard-coding certain configurations in a `SparkConf`. For
 instance, if you'd like to run the same application with different masters or different
 amounts of memory. Spark allows you to simply create an empty conf:
@@ -106,7 +107,8 @@ line will appear. For all other configuration properties, you can assume the def
 Most of the properties that control internal settings have reasonable default values. Some
 of the most common options to set are:
 
-#### Application Properties
+### Application Properties
+
 <table class="table">
 <tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
 <tr>
@@ -206,7 +208,8 @@ of the most common options to set are:
 
 Apart from these, the following properties are also available, and may be useful in some situations:
 
-#### Runtime Environment
+### Runtime Environment
+
 <table class="table">
 <tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
 <tr>
@@ -453,7 +456,8 @@ Apart from these, the following properties are also available, and may be useful
 </tr>
 </table>
 
-#### Shuffle Behavior
+### Shuffle Behavior
+
 <table class="table">
 <tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
 <tr>
@@ -594,7 +598,8 @@ Apart from these, the following properties are also available, and may be useful
 </tr>
 </table>
 
-#### Spark UI
+### Spark UI
+
 <table class="table">
 <tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
 <tr>
@@ -718,7 +723,8 @@ Apart from these, the following properties are also available, and may be useful
 </tr>
 </table>
 
-#### Compression and Serialization
+### Compression and Serialization
+
 <table class="table">
 <tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
 <tr>
@@ -864,7 +870,8 @@ Apart from these, the following properties are also available, and may be useful
 </tr>
 </table>
 
-#### Memory Management
+### Memory Management
+
 <table class="table">
 <tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
 <tr>
@@ -954,7 +961,8 @@ Apart from these, the following properties are also available, and may be useful
 </tr>
 </table>
 
-#### Execution Behavior
+### Execution Behavior
+
 <table class="table">
 <tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
 <tr>
@@ -1081,7 +1089,8 @@ Apart from these, the following properties are also available, and may be useful
 </tr>
 </table>
 
-#### Networking
+### Networking
+
 <table class="table">
 <tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
 <tr>
@@ -1112,13 +1121,13 @@ Apart from these, the following properties are also available, and may be useful
   <td><code>spark.driver.bindAddress</code></td>
   <td>(value of spark.driver.host)</td>
   <td>
-    <p>Hostname or IP address where to bind listening sockets. This config overrides the SPARK_LOCAL_IP
-    environment variable (see below).</p>
+    Hostname or IP address where to bind listening sockets. This config overrides the SPARK_LOCAL_IP
+    environment variable (see below).
 
-    <p>It also allows a different address from the local one to be advertised to executors or external systems.
+    <br />It also allows a different address from the local one to be advertised to executors or external systems.
     This is useful, for example, when running containers with bridged networking. For this to properly work,
     the different ports used by the driver (RPC, block manager and UI) need to be forwarded from the
-    container's host.</p>
+    container's host.
   </td>
 </tr>
 <tr>
@@ -1190,7 +1199,8 @@ Apart from these, the following properties are also available, and may be useful
 </tr>
 </table>
 
-#### Scheduling
+### Scheduling
+
 <table class="table">
 <tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
 <tr>
@@ -1410,7 +1420,8 @@ Apart from these, the following properties are also available, and may be useful
 </tr>
 </table>
 
-#### Dynamic Allocation
+### Dynamic Allocation
+
 <table class="table">
 <tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
 <tr>
@@ -1491,7 +1502,8 @@ Apart from these, the following properties are also available, and may be useful
 </tr>
 </table>
 
-#### Security
+### Security
+
 <table class="table">
 <tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
 <tr>
@@ -1647,7 +1659,7 @@ Apart from these, the following properties are also available, and may be useful
 </tr>
 </table>
 
-#### Encryption
+### Encryption
 
 <table class="table">
     <tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
@@ -1655,21 +1667,21 @@ Apart from these, the following properties are also available, and may be useful
         <td><code>spark.ssl.enabled</code></td>
         <td>false</td>
         <td>
-            <p>Whether to enable SSL connections on all supported protocols.</p>
+            Whether to enable SSL connections on all supported protocols.
 
-            <p>When <code>spark.ssl.enabled</code> is configured, <code>spark.ssl.protocol</code>
-            is required.</p>
+            <br />When <code>spark.ssl.enabled</code> is configured, <code>spark.ssl.protocol</code>
+            is required.
 
-            <p>All the SSL settings like <code>spark.ssl.xxx</code> where <code>xxx</code> is a
+            <br />All the SSL settings like <code>spark.ssl.xxx</code> where <code>xxx</code> is a
             particular configuration property, denote the global configuration for all the supported
             protocols. In order to override the global configuration for the particular protocol,
-            the properties must be overwritten in the protocol-specific namespace.</p>
+            the properties must be overwritten in the protocol-specific namespace.
 
-            <p>Use <code>spark.ssl.YYY.XXX</code> settings to overwrite the global configuration for
+            <br />Use <code>spark.ssl.YYY.XXX</code> settings to overwrite the global configuration for
             particular protocol denoted by <code>YYY</code>. Example values for <code>YYY</code>
             include <code>fs</code>, <code>ui</code>, <code>standalone</code>, and
             <code>historyServer</code>.  See <a href="security.html#ssl-configuration">SSL
-            Configuration</a> for details on hierarchical SSL configuration for services.</p>
+            Configuration</a> for details on hierarchical SSL configuration for services.
         </td>
     </tr>
     <tr>
@@ -1753,7 +1765,8 @@ Apart from these, the following properties are also available, and may be useful
 </table>
 
 
-#### Spark SQL
+### Spark SQL
+
 Running the <code>SET -v</code> command will show the entire list of the SQL configuration.
 
 <div class="codetabs">
@@ -1795,7 +1808,8 @@ showDF(properties, numRows = 200, truncate = FALSE)
 </div>
 
 
-#### Spark Streaming
+### Spark Streaming
+
 <table class="table">
 <tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
 <tr>
@@ -1916,7 +1930,8 @@ showDF(properties, numRows = 200, truncate = FALSE)
 </tr>
 </table>
 
-#### SparkR
+### SparkR
+
 <table class="table">
 <tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
 <tr>
@@ -1965,7 +1980,7 @@ showDF(properties, numRows = 200, truncate = FALSE)
 
 </table>
 
-#### Deploy
+### Deploy
 
 <table class="table">
   <tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
@@ -1988,15 +2003,16 @@ showDF(properties, numRows = 200, truncate = FALSE)
 </table>
 
 
-#### Cluster Managers
+### Cluster Managers
+
 Each cluster manager in Spark has additional configuration options. Configurations
 can be found on the pages for each mode:
 
-##### [YARN](running-on-yarn.html#configuration)
+#### [YARN](running-on-yarn.html#configuration)
 
-##### [Mesos](running-on-mesos.html#configuration)
+#### [Mesos](running-on-mesos.html#configuration)
 
-##### [Standalone Mode](spark-standalone.html#cluster-launch-scripts)
+#### [Standalone Mode](spark-standalone.html#cluster-launch-scripts)
 
 # Environment Variables
 



