From: zjffdu@apache.org
To: commits@zeppelin.apache.org
Date: Sun, 29 Sep 2019 07:08:15 -0000
Message-Id: <20190929070818.46C263A39F6@svn01-us-west.apache.org>
Subject: svn commit: r1867691 [25/41] - in /zeppelin/site: docs/0.8.2/ docs/0.8.2/assets/ docs/0.8.2/assets/themes/ docs/0.8.2/assets/themes/zeppelin/ docs/0.8.2/assets/themes/zeppelin/bootstrap/ docs/0.8.2/assets/themes/zeppelin/bootstrap/css/ docs/0.8.2/asset...

Propchange: zeppelin/site/docs/0.8.2/search_data.json
------------------------------------------------------------------------------
    svn:executable = *

Added: zeppelin/site/docs/0.8.2/setup/basics/how_to_build.html
URL: http://svn.apache.org/viewvc/zeppelin/site/docs/0.8.2/setup/basics/how_to_build.html?rev=1867691&view=auto
==============================================================================
--- zeppelin/site/docs/0.8.2/setup/basics/how_to_build.html (added)
+++ zeppelin/site/docs/0.8.2/setup/basics/how_to_build.html Sun Sep 29 07:08:10 2019
@@ -0,0 +1,562 @@

Apache Zeppelin 0.8.0 Documentation: How to Build Zeppelin from source

How to Build Zeppelin from Source


0. Requirements


If you want to build from source, you must first install the following dependencies:

Name     Value
Git      (Any Version)
Maven    3.1.x or higher
JDK      1.7

If you haven't installed Git and Maven yet, check the Build requirements section and follow the step-by-step instructions there.
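
For a quick sanity check before building, you can verify that the prerequisites are on your PATH and at suitable versions (a minimal sketch; adjust to your environment):

git --version     # any version is fine
mvn -version      # should report Maven 3.1.x or higher
java -version     # should report the JDK named above
node --version    # node is needed later by the web application build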


1. Clone the Apache Zeppelin repository

git clone https://github.com/apache/zeppelin.git

2. Build source


You can build Zeppelin with the following maven command:

mvn clean package -DskipTests [Options]

If you're unsure about the options, use the same commands that create the official binary package.

# update all pom.xml to use scala 2.11
./dev/change_scala_version.sh 2.11
# build zeppelin with all interpreters and include the latest version of Apache Spark support for local mode
mvn clean package -DskipTests -Pspark-2.0 -Phadoop-2.4 -Pr -Pscala-2.11

3. Done


After a successful build, you can start Zeppelin directly by running:

./bin/zeppelin-daemon.sh start
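
The same daemon script can also be used to stop or inspect the server; the subcommands below are the ones the script conventionally accepts (verify against your build's bin/zeppelin-daemon.sh):

./bin/zeppelin-daemon.sh status
./bin/zeppelin-daemon.sh restart
./bin/zeppelin-daemon.sh stop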

Check the Build profiles section for further build options. If you are behind a proxy, follow the instructions in the Proxy setting section.


If you're interested in contributing, please check Contributing to Apache Zeppelin (Code) and Contributing to Apache Zeppelin (Website).


Build profiles


Spark Interpreter


To build with a specific Spark version, Hadoop version or specific features, define one or more of the following profiles and options:

-Pspark-[version]

Set the Spark major version.


Available profiles are

-Pspark-2.1
-Pspark-2.0
-Pspark-1.6
-Pspark-1.5
-Pspark-1.4
-Pcassandra-spark-1.5
-Pcassandra-spark-1.4
-Pcassandra-spark-1.3
-Pcassandra-spark-1.2
-Pcassandra-spark-1.1

The minor version can be adjusted with -Dspark.version=x.x.x.
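
For example, a build pinned to a specific Spark 2.0 patch release might look like the following (the exact patch version is only illustrative):

mvn clean package -DskipTests -Pspark-2.0 -Dspark.version=2.0.2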

-Phadoop-[version]

Set the Hadoop major version.


Available profiles are

-Phadoop-0.23
-Phadoop-1
-Phadoop-2.2
-Phadoop-2.3
-Phadoop-2.4
-Phadoop-2.6
-Phadoop-2.7

The minor version can be adjusted with -Dhadoop.version=x.x.x.
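
Likewise, a build against a specific Hadoop 2.6 patch release could be requested as follows (the patch version is only illustrative):

mvn clean package -DskipTests -Phadoop-2.6 -Dhadoop.version=2.6.5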

-Pscala-[version] (optional)

Set the Scala version (default 2.10). Available profiles are:

-Pscala-2.10
-Pscala-2.11
-Pr (optional)

Enable R support with SparkR integration.

-Pvendor-repo (optional)

Enable the 3rd-party vendor repository (Cloudera).

-Pmapr[version] (optional)

For the MapR Hadoop distribution, these profiles will handle the Hadoop version. As MapR allows different versions of Spark to be installed, you should specify which version of Spark is installed on the cluster by adding a Spark profile (-Pspark-1.6, -Pspark-2.0, etc.) as needed. The correct Maven artifacts for every MapR version can be found at http://doc.mapr.com.


Available profiles are

-Pmapr3
-Pmapr40
-Pmapr41
-Pmapr50
-Pmapr51

-Pexamples (optional)


Build the examples under the zeppelin-examples directory.


Build command examples


Here are some examples with several options:

# build with spark-2.1, scala-2.11
./dev/change_scala_version.sh 2.11
mvn clean package -Pspark-2.1 -Phadoop-2.4 -Pscala-2.11 -DskipTests

# build with spark-2.0, scala-2.11
./dev/change_scala_version.sh 2.11
mvn clean package -Pspark-2.0 -Phadoop-2.4 -Pscala-2.11 -DskipTests

# build with spark-1.6, scala-2.10
mvn clean package -Pspark-1.6 -Phadoop-2.4 -DskipTests

# spark-cassandra integration
mvn clean package -Pcassandra-spark-1.5 -Dhadoop.version=2.6.0 -Phadoop-2.6 -DskipTests

# with CDH
mvn clean package -Pspark-1.5 -Dhadoop.version=2.6.0-cdh5.5.0 -Phadoop-2.6 -Pvendor-repo -DskipTests

# with MapR
mvn clean package -Pspark-1.5 -Pmapr50 -DskipTests

Ignite Interpreter

mvn clean package -Dignite.version=1.9.0 -DskipTests

Scalding Interpreter

mvn clean package -Pscalding -DskipTests

Optional configurations


Here are additional configurations that can optionally be tuned using trailing -D options on maven commands:


Spark package

spark.archive             # default spark-${spark.version}
spark.src.download.url    # default http://d3kbcqa49mib13.cloudfront.net/${spark.archive}.tgz
spark.bin.download.url    # default http://d3kbcqa49mib13.cloudfront.net/${spark.archive}-bin-without-hadoop.tgz
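
For instance, if the default CloudFront location is unreachable from your network, you could point the binary download at a different mirror; the URL below follows the Apache archive layout and is shown only as an illustration:

mvn clean package -DskipTests -Pspark-2.0 \
  -Dspark.bin.download.url=https://archive.apache.org/dist/spark/spark-2.0.2/spark-2.0.2-bin-without-hadoop.tgz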

Py4J package

python.py4j.version       # default 0.9.2
pypi.repo.url             # default https://pypi.python.org/packages
python.py4j.repo.folder   # default /64/5c/01e13b68e8caafece40d549f232c9b5677ad1016071a48d04cc3895acaa3

The final URL for the Py4J package will be produced as follows:


${pypi.repo.url}${python.py4j.repo.folder}py4j-${python.py4j.version}.zip
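
With the defaults above, the pattern expands to roughly the following URL (a sketch to show how the pieces concatenate; the exact separator between the folder and the file name comes from the pom defaults):

https://pypi.python.org/packages/64/5c/01e13b68e8caafece40d549f232c9b5677ad1016071a48d04cc3895acaa3/py4j-0.9.2.zip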


Frontend Maven Plugin configurations

plugin.frontend.nodeDownloadRoot   # default https://nodejs.org/dist/
plugin.frontend.npmDownloadRoot    # default http://registry.npmjs.org/npm/-/
plugin.frontend.yarnDownloadRoot   # default https://github.com/yarnpkg/yarn/releases/download/
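
As an example, behind a corporate firewall you might redirect the node and npm downloads to an internal mirror (the mirror host below is a made-up placeholder):

mvn clean package -DskipTests \
  -Dplugin.frontend.nodeDownloadRoot=https://mirror.example.com/nodejs/dist/ \
  -Dplugin.frontend.npmDownloadRoot=https://mirror.example.com/npm/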

Build requirements


Install requirements


If you don't have the requirements prepared yet, install them. (The installation method may vary depending on your environment; the example below is for Ubuntu.)

sudo apt-get update
sudo apt-get install git
sudo apt-get install openjdk-7-jdk
sudo apt-get install npm
sudo apt-get install libfontconfig
sudo apt-get install r-base-dev
sudo apt-get install r-cran-evaluate

Install maven

wget http://www.eu.apache.org/dist/maven/maven-3/3.3.9/binaries/apache-maven-3.3.9-bin.tar.gz
sudo tar -zxf apache-maven-3.3.9-bin.tar.gz -C /usr/local/
sudo ln -s /usr/local/apache-maven-3.3.9/bin/mvn /usr/local/bin/mvn

Notes:
 - Ensure node is installed by running node --version
 - Ensure maven is running version 3.1.x or higher with mvn -version
 - Configure maven to use more memory than usual by export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=1024m"


Proxy setting (optional)


If you're behind a proxy, you'll need to configure maven and npm to pass through it.


First of all, configure maven in your ~/.m2/settings.xml.

<settings>
  <proxies>
    <proxy>
      <id>proxy-http</id>
      <active>true</active>
      <protocol>http</protocol>
      <host>localhost</host>
      <port>3128</port>
      <!-- <username>usr</username>
      <password>pwd</password> -->
      <nonProxyHosts>localhost|127.0.0.1</nonProxyHosts>
    </proxy>
    <proxy>
      <id>proxy-https</id>
      <active>true</active>
      <protocol>https</protocol>
      <host>localhost</host>
      <port>3128</port>
      <!-- <username>usr</username>
      <password>pwd</password> -->
      <nonProxyHosts>localhost|127.0.0.1</nonProxyHosts>
    </proxy>
  </proxies>
</settings>

Then, the following commands will configure npm.

npm config set proxy http://localhost:3128
npm config set https-proxy http://localhost:3128
npm config set registry "http://registry.npmjs.org/"
npm config set strict-ssl false

Configure git as well

git config --global http.proxy http://localhost:3128
git config --global https.proxy http://localhost:3128
git config --global url."http://".insteadOf git://

To clean up, set active to false in Maven settings.xml and run these commands:

npm config rm proxy
npm config rm https-proxy
git config --global --unset http.proxy
git config --global --unset https.proxy
git config --global --unset url."http://".insteadOf

Notes:
 - If you are behind an NTLM proxy you can use Cntlm Authentication Proxy.
 - Replace localhost:3128 with the standard pattern http://user:pwd@host:port.


Package


To package the final distribution including the compressed archive, run:

mvn clean package -Pbuild-distr

To build a distribution with specific profiles, run:

mvn clean package -Pbuild-distr -Pspark-1.5 -Phadoop-2.4

The profiles -Pspark-1.5 -Phadoop-2.4 can be adjusted if you wish to build against specific Spark and Hadoop versions.


The archive is generated under the zeppelin-distribution/target directory.
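
Once the build finishes, you can locate the generated archive there; for example (the exact file name depends on the Zeppelin version and the profiles you built with):

ls zeppelin-distribution/target/zeppelin-*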


Run end-to-end tests


Zeppelin comes with a set of end-to-end acceptance tests driving a headless Selenium browser.

# assumes zeppelin-server running on localhost:8080 (use -Durl=.. to override)
mvn verify

# or take care of starting/stopping zeppelin-server from the packaged zeppelin-distribution/target
mvn verify -P using-packaged-distr
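
For instance, to point the suite at a server that is not on the default address, the URL can be overridden on the command line (the host and port below are placeholders):

mvn verify -Durl=http://zeppelin-host.example.com:8080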

Propchange: zeppelin/site/docs/0.8.2/setup/basics/how_to_build.html
------------------------------------------------------------------------------
    svn:executable = *

Added: zeppelin/site/docs/0.8.2/setup/basics/multi_user_support.html
URL: http://svn.apache.org/viewvc/zeppelin/site/docs/0.8.2/setup/basics/multi_user_support.html?rev=1867691&view=auto
==============================================================================
--- zeppelin/site/docs/0.8.2/setup/basics/multi_user_support.html (added)
+++ zeppelin/site/docs/0.8.2/setup/basics/multi_user_support.html Sun Sep 29 07:08:10 2019
@@ -0,0 +1,303 @@

Apache Zeppelin 0.8.0 Documentation: Multi-user Support

Multi-user Support


This page describes multi-user support.

Propchange: zeppelin/site/docs/0.8.2/setup/basics/multi_user_support.html
------------------------------------------------------------------------------
    svn:executable = *

Added: zeppelin/site/docs/0.8.2/setup/deployment/cdh.html
URL: http://svn.apache.org/viewvc/zeppelin/site/docs/0.8.2/setup/deployment/cdh.html?rev=1867691&view=auto
==============================================================================
--- zeppelin/site/docs/0.8.2/setup/deployment/cdh.html (added)
+++ zeppelin/site/docs/0.8.2/setup/deployment/cdh.html Sun Sep 29 07:08:10 2019
@@ -0,0 +1,360 @@

Apache Zeppelin 0.8.0 Documentation: Apache Zeppelin on CDH

Apache Zeppelin on CDH


1. Import Cloudera QuickStart Docker image


Cloudera officially provides a CDH QuickStart image on Docker Hub. Please check this guide page for more information.


You can import the Docker image by pulling it from Cloudera Docker Hub.

docker pull cloudera/quickstart:latest

2. Run docker

docker run -it \
 -p 80:80 \
 -p 4040:4040 \
 -p 8020:8020 \
 -p 8022:8022 \
 -p 8030:8030 \
 -p 8032:8032 \
 -p 8033:8033 \
 -p 8040:8040 \
 -p 8042:8042 \
 -p 8088:8088 \
 -p 8480:8480 \
 -p 8485:8485 \
 -p 8888:8888 \
 -p 9083:9083 \
 -p 10020:10020 \
 -p 10033:10033 \
 -p 18088:18088 \
 -p 19888:19888 \
 -p 25000:25000 \
 -p 25010:25010 \
 -p 25020:25020 \
 -p 50010:50010 \
 -p 50020:50020 \
 -p 50070:50070 \
 -p 50075:50075 \
 -h quickstart.cloudera --privileged=true \
 agitated_payne_backup /usr/bin/docker-quickstart;

3. Verify running CDH


To verify the application is running well, check the web UI for HDFS on http://<hostname>:50070/ and YARN on http://<hostname>:8088/cluster.
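
From the Docker host you could also probe both UIs from the command line, for example (assuming the ports are published on localhost as in the docker run command above):

curl -s -o /dev/null -w "%{http_code}\n" http://localhost:50070/
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8088/cluster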


4. Configure Spark interpreter in Zeppelin


Set the following configurations in conf/zeppelin-env.sh.

export MASTER=yarn-client
export HADOOP_CONF_DIR=[your_hadoop_conf_path]
export SPARK_HOME=[your_spark_home_path]
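
For the CDH QuickStart container, plausible values might look like the following (the paths are assumptions about a typical CDH layout; check where the Hadoop configuration and Spark actually live in your image):

export MASTER=yarn-client
export HADOOP_CONF_DIR=/etc/hadoop/conf
export SPARK_HOME=/usr/lib/spark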

HADOOP_CONF_DIR (the Hadoop configuration path) is defined in /scripts/docker/spark-cluster-managers/cdh/hdfs_conf.


Don't forget to set the Spark master to yarn-client on the Zeppelin Interpreters setting page, as shown below.


+ +

5. Run Zeppelin with Spark interpreter


After running a single paragraph with the Spark interpreter in Zeppelin, browse http://<hostname>:8088/cluster/apps to check whether the Zeppelin application is running.
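
Alternatively, if you have shell access to the QuickStart container, you could list the running YARN applications directly (a sketch; replace <container-name> with the name or ID of your running container):

docker exec -it <container-name> yarn application -list -appStates RUNNING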


+ +
Propchange: zeppelin/site/docs/0.8.2/setup/deployment/cdh.html
------------------------------------------------------------------------------
    svn:executable = *