zeppelin-commits mailing list archives

From b..@apache.org
Subject zeppelin git commit: [ZEPPELIN-1279] Spark on Mesos Docker.
Date Sat, 03 Sep 2016 02:41:47 GMT
Repository: zeppelin
Updated Branches:
  refs/heads/master 33ddc00c6 -> cee58aa03


[ZEPPELIN-1279] Spark on Mesos Docker.

### What is this PR for?
This PR adds documentation for running Zeppelin in production environments, specifically Spark on Mesos via Docker.
Related PRs are https://github.com/apache/zeppelin/pull/1227 and https://github.com/apache/zeppelin/pull/1318,
and https://github.com/sequenceiq/hadoop-docker provided a lot of useful hints.
Tested on Ubuntu.

### What type of PR is it?
Documentation

### What is the Jira issue?
https://issues.apache.org/jira/browse/ZEPPELIN-1279

### How should this be tested?
You can refer to https://github.com/apache/zeppelin/blob/master/docs/README.md#build-documentation.
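
For quick reference, a minimal sketch of building and previewing the docs locally, assuming Ruby and Bundler are already installed (the actual steps are the ones described in docs/README.md linked above; adjust if your environment differs):

```
# from the Zeppelin source tree
cd docs
bundle install                      # install Jekyll and the other Ruby dependencies
bundle exec jekyll serve --watch    # build and serve the site, by default at http://localhost:4000
```

Then open the "Spark Cluster Mode" page and check that the new Mesos section and navigation entries render correctly.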

### Questions:
* Do the license files need updating? No
* Are there breaking changes for older versions? No
* Does this need documentation? No

Author: astroshim <hsshim@nflabs.com>
Author: AhyoungRyu <fbdkdud93@hanmail.net>
Author: HyungSung <hsshim@nflabs.com>

Closes #1389 from astroshim/ZEPPELIN-1279 and squashes the following commits:

974366a [HyungSung] Merge pull request #10 from AhyoungRyu/ZEPPELIN-1279-ahyoung
076fdba [AhyoungRyu] Change zeppelin_mesos_conf.png file
1cbe9d3 [astroshim] fix spark version and mesos
2b821b4 [astroshim] fix docs
159bafc [astroshim] fix anchor
d8c43b4 [astroshim] add navigation
c808350 [astroshim] add image file and doc
a3b0ded [astroshim] create dockerfile for mesos


Project: http://git-wip-us.apache.org/repos/asf/zeppelin/repo
Commit: http://git-wip-us.apache.org/repos/asf/zeppelin/commit/cee58aa0
Tree: http://git-wip-us.apache.org/repos/asf/zeppelin/tree/cee58aa0
Diff: http://git-wip-us.apache.org/repos/asf/zeppelin/diff/cee58aa0

Branch: refs/heads/master
Commit: cee58aa0387951e94bee2904600ca89d731b97ff
Parents: 33ddc00
Author: astroshim <hsshim@nflabs.com>
Authored: Fri Sep 2 13:51:40 2016 +0900
Committer: Alexander Bezzubov <bzz@apache.org>
Committed: Sat Sep 3 11:41:40 2016 +0900

----------------------------------------------------------------------
 docs/_includes/themes/zeppelin/_navigation.html |   3 +-
 .../zeppelin/img/docs-img/mesos_frameworks.png  | Bin 0 -> 77506 bytes
 .../img/docs-img/zeppelin_mesos_conf.png        | Bin 0 -> 127941 bytes
 docs/index.md                                   |   3 +-
 docs/install/spark_cluster_mode.md              |  66 ++++++++++++++++++-
 .../spark_mesos/Dockerfile                      |  63 ++++++++++++++++++
 .../spark_mesos/entrypoint.sh                   |  48 ++++++++++++++
 7 files changed, 179 insertions(+), 4 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/zeppelin/blob/cee58aa0/docs/_includes/themes/zeppelin/_navigation.html
----------------------------------------------------------------------
diff --git a/docs/_includes/themes/zeppelin/_navigation.html b/docs/_includes/themes/zeppelin/_navigation.html
index 4bbe64e..9bd9967 100644
--- a/docs/_includes/themes/zeppelin/_navigation.html
+++ b/docs/_includes/themes/zeppelin/_navigation.html
@@ -105,7 +105,8 @@
                 <li class="title"><span><b>Advanced</b><span></li>
                 <li><a href="{{BASE_PATH}}/install/virtual_machine.html">Zeppelin on Vagrant VM</a></li>
                 <li><a href="{{BASE_PATH}}/install/spark_cluster_mode.html#spark-standalone-mode">Zeppelin on Spark Cluster Mode (Standalone)</a></li>
-                <li><a href="{{BASE_PATH}}/install/spark_cluster_mode.html#spark-standalone-mode">Zeppelin on Spark Cluster Mode (YARN)</a></li>
+                <li><a href="{{BASE_PATH}}/install/spark_cluster_mode.html#spark-on-yarn-mode">Zeppelin on Spark Cluster Mode (YARN)</a></li>
+                <li><a href="{{BASE_PATH}}/install/spark_cluster_mode.html#spark-on-mesos-mode">Zeppelin on Spark Cluster Mode (Mesos)</a></li>
                 <li role="separator" class="divider"></li>
                 <li class="title"><span><b>Contibute</b><span></li>
                 <li><a href="{{BASE_PATH}}/development/writingzeppelininterpreter.html">Writing Zeppelin Interpreter</a></li>

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/cee58aa0/docs/assets/themes/zeppelin/img/docs-img/mesos_frameworks.png
----------------------------------------------------------------------
diff --git a/docs/assets/themes/zeppelin/img/docs-img/mesos_frameworks.png b/docs/assets/themes/zeppelin/img/docs-img/mesos_frameworks.png
new file mode 100644
index 0000000..af42893
Binary files /dev/null and b/docs/assets/themes/zeppelin/img/docs-img/mesos_frameworks.png differ

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/cee58aa0/docs/assets/themes/zeppelin/img/docs-img/zeppelin_mesos_conf.png
----------------------------------------------------------------------
diff --git a/docs/assets/themes/zeppelin/img/docs-img/zeppelin_mesos_conf.png b/docs/assets/themes/zeppelin/img/docs-img/zeppelin_mesos_conf.png
new file mode 100644
index 0000000..b85a3da
Binary files /dev/null and b/docs/assets/themes/zeppelin/img/docs-img/zeppelin_mesos_conf.png differ

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/cee58aa0/docs/index.md
----------------------------------------------------------------------
diff --git a/docs/index.md b/docs/index.md
index bff5253..9e0752b 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -170,7 +170,8 @@ Join to our [Mailing list](https://zeppelin.apache.org/community.html) and repor
 * Advanced
   * [Apache Zeppelin on Vagrant VM](./install/virtual_machine.html)
   * [Zeppelin on Spark Cluster Mode (Standalone via Docker)](./install/spark_cluster_mode.html#spark-standalone-mode)
-  * [Zeppelin on Spark Cluster Mode (YARN via Docker)](./install/spark_cluster_mode.html#spark-yarn-mode)
+  * [Zeppelin on Spark Cluster Mode (YARN via Docker)](./install/spark_cluster_mode.html#spark-on-yarn-mode)
+  * [Zeppelin on Spark Cluster Mode (Mesos via Docker)](./install/spark_cluster_mode.html#spark-on-mesos-mode)
 * Contribute
   * [Writing Zeppelin Interpreter](./development/writingzeppelininterpreter.html)
   * [Writing Zeppelin Application (Experimental)](./development/writingzeppelinapplication.html)

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/cee58aa0/docs/install/spark_cluster_mode.md
----------------------------------------------------------------------
diff --git a/docs/install/spark_cluster_mode.md b/docs/install/spark_cluster_mode.md
index 47f688c..d4c864a 100644
--- a/docs/install/spark_cluster_mode.md
+++ b/docs/install/spark_cluster_mode.md
@@ -1,7 +1,7 @@
 ---
 layout: page
 title: "Apache Zeppelin on Spark cluster mode"
-description: "This document will guide you how you can build and configure the environment on 3 types of Spark cluster manager with Apache Zeppelin using docker scripts."
+description: "This document will guide you through building and configuring the environment on 3 types of Spark cluster managers (Standalone, Hadoop YARN, Apache Mesos) with Apache Zeppelin using Docker scripts."
 group: install
 ---
 <!--
@@ -113,7 +113,7 @@ docker run -it \
 
 ### 3. Verify running Spark on YARN.
 
-You can simply verify the processes of Spark and YARN is running well in Docker with below command.
+You can simply verify that the Spark and YARN processes are running in Docker with the command below.
 
 ```
 ps -ef
@@ -140,3 +140,65 @@ Don't forget to set Spark `master` as `yarn-client` in Zeppelin **Interpreters**
 After running a single paragraph with Spark interpreter in Zeppelin, browse `http://<hostname>:8088/cluster/apps` and check Zeppelin application is running well or not.
 
 <img src="../assets/themes/zeppelin/img/docs-img/yarn_applications.png" />
+
+
+
+## Spark on Mesos mode
+You can simply set up a [Spark on Mesos](http://spark.apache.org/docs/latest/running-on-mesos.html) Docker environment with the steps below.
+
+
+### 1. Build Docker file
+
+```
+cd $ZEPPELIN_HOME/scripts/docker/spark-cluster-managers/spark_mesos
+docker build -t "spark_mesos" .
+```
+
+
+### 2. Run docker
+
+```
+docker run --net=host -it \
+-p 8080:8080 \
+-p 7077:7077 \
+-p 8888:8888 \
+-p 8081:8081 \
+-p 8082:8082 \
+-p 5050:5050 \
+-p 5051:5051 \
+-p 4040:4040 \
+-h sparkmaster \
+--name spark_mesos \
+spark_mesos bash;
+```
+
+### 3. Verify running Spark on Mesos.
+
+You can simply verify that the Spark and Mesos processes are running in Docker with the command below.
+
+```
+ps -ef
+```
+
+You can also check each application's web UI: Mesos at `http://<hostname>:5050/cluster` and Spark at `http://<hostname>:8080/`.
+
+
+### 4. Configure Spark interpreter in Zeppelin
+
+```
+export MASTER=mesos://127.0.1.1:5050
+export MESOS_NATIVE_JAVA_LIBRARY=[PATH OF libmesos.so]
+export SPARK_HOME=[PATH OF SPARK HOME]
+```
+
+
+Don't forget to set the Spark `master` property to `mesos://127.0.1.1:5050` on the Zeppelin **Interpreters** setting page, as shown below.
+
+<img src="../assets/themes/zeppelin/img/docs-img/zeppelin_mesos_conf.png" />
+
+
+### 5. Run Zeppelin with Spark interpreter
+After running a single paragraph with the Spark interpreter in Zeppelin, browse `http://<hostname>:5050/#/frameworks` and check whether the Zeppelin application is running.
+
+<img src="../assets/themes/zeppelin/img/docs-img/mesos_frameworks.png" />
+

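For step 5 above, beyond checking the Mesos web UI, a minimal shell sketch for confirming that Zeppelin registered as a framework with the Mesos master; the hostname, port, and framework-name filter here are illustrative assumptions, not part of this commit:

```
# query the Mesos master's state API and list registered framework names,
# then filter for Zeppelin (localhost:5050 and the "Zeppelin" name are assumptions)
curl -s http://localhost:5050/master/state \
  | grep -o '"name":"[^"]*"' \
  | grep -i zeppelin
```

If nothing matches, re-check the Spark `master` setting and the `MESOS_NATIVE_JAVA_LIBRARY` path from step 4.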
http://git-wip-us.apache.org/repos/asf/zeppelin/blob/cee58aa0/scripts/docker/spark-cluster-managers/spark_mesos/Dockerfile
----------------------------------------------------------------------
diff --git a/scripts/docker/spark-cluster-managers/spark_mesos/Dockerfile b/scripts/docker/spark-cluster-managers/spark_mesos/Dockerfile
new file mode 100644
index 0000000..450afef
--- /dev/null
+++ b/scripts/docker/spark-cluster-managers/spark_mesos/Dockerfile
@@ -0,0 +1,63 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+FROM centos:centos6
+
+ENV SPARK_PROFILE 2.0
+ENV SPARK_VERSION 2.0.0
+ENV HADOOP_PROFILE 2.3
+ENV HADOOP_VERSION 2.3.0
+
+# Update the image with the latest packages
+RUN yum update -y; yum clean all
+
+# Get utils
+RUN yum install -y \
+wget \
+tar \
+curl \
+svn \
+cyrus-sasl-md5 \
+libevent2-devel \
+&& \
+yum clean all
+
+# Remove old jdk
+RUN yum remove java; yum remove jdk
+
+# install jdk7
+RUN yum install -y java-1.7.0-openjdk-devel
+ENV JAVA_HOME /usr/lib/jvm/java
+ENV PATH $PATH:$JAVA_HOME/bin
+
+# install spark
+RUN curl -s http://www.apache.org/dist/spark/spark-$SPARK_VERSION/spark-$SPARK_VERSION-bin-hadoop$HADOOP_PROFILE.tgz | tar -xz -C /usr/local/
+RUN cd /usr/local && ln -s spark-$SPARK_VERSION-bin-hadoop$HADOOP_PROFILE spark
+
+# update boot script
+COPY entrypoint.sh /etc/entrypoint.sh
+RUN chown root.root /etc/entrypoint.sh
+RUN chmod 700 /etc/entrypoint.sh
+
+# install mesos
+RUN wget http://repos.mesosphere.com/el/6/x86_64/RPMS/mesos-1.0.0-2.0.89.centos65.x86_64.rpm
+RUN rpm -Uvh mesos-1.0.0-2.0.89.centos65.x86_64.rpm
+
+#spark
+EXPOSE 8080 7077 7072 8081 8082
+
+#mesos
+EXPOSE 5050 5051
+
+ENTRYPOINT ["/etc/entrypoint.sh"]

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/cee58aa0/scripts/docker/spark-cluster-managers/spark_mesos/entrypoint.sh
----------------------------------------------------------------------
diff --git a/scripts/docker/spark-cluster-managers/spark_mesos/entrypoint.sh b/scripts/docker/spark-cluster-managers/spark_mesos/entrypoint.sh
new file mode 100755
index 0000000..28d76bf
--- /dev/null
+++ b/scripts/docker/spark-cluster-managers/spark_mesos/entrypoint.sh
@@ -0,0 +1,48 @@
+#!/bin/bash
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements. See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+export SPARK_HOME=/usr/local/spark/
+export SPARK_MASTER_PORT=7077
+export SPARK_MASTER_WEBUI_PORT=8080
+export SPARK_WORKER_PORT=8888
+export SPARK_WORKER_WEBUI_PORT=8081
+export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$JAVA_HOME/jre/lib/amd64/server/
+
+# spark configuration
+cp $SPARK_HOME/conf/spark-env.sh.template $SPARK_HOME/conf/spark-env.sh
+echo "export MESOS_NATIVE_JAVA_LIBRARY=/usr/lib/libmesos.so" >> $SPARK_HOME/conf/spark-env.sh
+
+cp $SPARK_HOME/conf/spark-defaults.conf.template $SPARK_HOME/conf/spark-defaults.conf
+echo "spark.master mesos://`hostname`:5050" >> $SPARK_HOME/conf/spark-defaults.conf
+echo "spark.mesos.executor.home /usr/local/spark" >> $SPARK_HOME/conf/spark-defaults.conf
+
+# run spark
+cd $SPARK_HOME/sbin
+./start-master.sh
+./start-slave.sh spark://`hostname`:$SPARK_MASTER_PORT
+
+# start mesos
+mesos-master --ip=0.0.0.0 --work_dir=/var/lib/mesos &> /var/log/mesos_master.log &
+mesos-slave --master=0.0.0.0:5050 --work_dir=/var/lib/mesos --launcher=posix &> /var/log/mesos_slave.log &
+
+CMD=${1:-"exit 0"}
+if [[ "$CMD" == "-d" ]];
+then
+	service sshd stop
+	/usr/sbin/sshd -D -d
+else
+	/bin/bash -c "$*"
+fi

