zeppelin-commits mailing list archives

From prabhjyotsi...@apache.org
Subject zeppelin git commit: ZEPPELIN-1242. Should set property SPARK_YARN_MODE and do login before creating any spark stuff
Date Sat, 30 Jul 2016 06:23:39 GMT
Repository: zeppelin
Updated Branches:
  refs/heads/master 848dbd030 -> 83e7da7aa


ZEPPELIN-1242. Should set property SPARK_YARN_MODE and do login before creating any spark stuff

### What is this PR for?

We run Zeppelin on Spark with Spark authentication turned on, but hit the following exception:
```
 INFO [2016-07-28 00:35:32,845] ({pool-2-thread-2} Logging.scala[logInfo]:58) - Changing view acls to: zeppelin
 INFO [2016-07-28 00:35:32,846] ({pool-2-thread-2} Logging.scala[logInfo]:58) - Changing modify acls to: zeppelin
 INFO [2016-07-28 00:35:32,908] ({pool-1-thread-3} Logging.scala[logInfo]:58) - Changing view acls to: zeppelin
 INFO [2016-07-28 00:35:32,908] ({pool-1-thread-3} Logging.scala[logInfo]:58) - Changing modify acls to: zeppelin
ERROR [2016-07-28 00:35:32,909] ({pool-2-thread-2} Job.java[run]:189) - Job failed
java.lang.IllegalArgumentException: Error: a secret key must be specified via the spark.authenticate.secret config
        at org.apache.spark.SecurityManager.generateSecretKey(SecurityManager.scala:397)
        at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:219)
        at org.apache.spark.repl.SparkIMain.<init>(SparkIMain.scala:118)
        at org.apache.spark.repl.SparkILoop$SparkILoopInterpreter.<init>(SparkILoop.scala:187)
        at org.apache.spark.repl.SparkILoop.createInterpreter(SparkILoop.scala:217)
        at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:566)
        at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:69)
        at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:93)
        at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:341)
        at org.apache.zeppelin.scheduler.Job.run(Job.java:176)
        at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
ERROR [2016-07-28 00:35:32,918] ({pool-1-thread-3} TThreadPoolServer.java[run]:296) - Error occurred during processing of message.
java.lang.IllegalArgumentException: Error: a secret key must be specified via the spark.authenticate.secret config
        at org.apache.spark.SecurityManager.generateSecretKey(SecurityManager.scala:397)
        at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:219)
        at org.apache.spark.repl.SparkIMain.<init>(SparkIMain.scala:118)
        at org.apache.spark.repl.SparkILoop$SparkILoopInterpreter.<init>(SparkILoop.scala:187)
        at org.apache.spark.repl.SparkILoop.createInterpreter(SparkILoop.scala:217)
        at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:566)
        at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:69)
        at org.apache.zeppelin.interpreter.LazyOpenInterpreter.getProgress(LazyOpenInterpreter.java:110)
        at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer.getProgress(RemoteInterpreterServer.java:404)
        at org.apache.zeppelin.interpreter.thrift.RemoteInterpreterService$Processor$getProgress.getResult(RemoteInterpreterService.java:1509)
        at org.apache.zeppelin.interpreter.thrift.RemoteInterpreterService$Processor$getProgress.getResult(RemoteInterpreterService.java:1494)
        at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
        at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
        at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
```
The root cause is that we do not set the SPARK_YARN_MODE property the way spark-shell does.

The following code in Main.scala is what the Zeppelin Spark interpreter is missing: https://github.com/apache/spark/blob/branch-1.6/repl/scala-2.11/src/main/scala/org/apache/spark/repl/Main.scala
```
  def main(args: Array[String]) {
    if (getMaster == "yarn-client") System.setProperty("SPARK_YARN_MODE", "true")
```
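In the Zeppelin Spark interpreter the equivalent guard belongs at the top of `SparkInterpreter.open()`, before any `SparkConf` or `SparkContext` is created. A minimal sketch of that idea (the class and method names here are illustrative only; the actual change is in the diff at the end of this message):
```
import java.util.Properties;

// Hypothetical sketch, not the exact Zeppelin code: set SPARK_YARN_MODE before
// any Spark objects exist, as spark-shell's Main.scala does for yarn-client mode.
public class YarnModeSetup {
  // 'props' stands in for the interpreter properties behind Zeppelin's
  // getProperty() accessor.
  public static void configureYarnMode(Properties props) {
    String master = props.getProperty("master", "");
    if ("yarn-client".equals(master)) {
      // With this system property set, Spark resolves the authentication secret
      // through YARN instead of demanding spark.authenticate.secret (the error
      // seen in the log above).
      System.setProperty("SPARK_YARN_MODE", "true");
    }
  }
}
```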
Besides that, we need to log in before creating any Spark objects, otherwise we hit the ClassNotFound issue described in http://mail-archives.apache.org/mod_mbox/zeppelin-users/201606.mbox/%3CCAH-=KK2SzsXX5zvfnuoLbB1753Tm196_bcB83NzzooedSbLpRQmail.gmail.com%3E.
The cause is that SecurityManager adds the secret key to the credentials, so the login has to happen before that point; otherwise the executor cannot connect to the HttpFileServer on the driver because the secret key is missing.
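A minimal sketch of that login step, assuming the keytab and principal are passed in as the interpreter properties `spark.yarn.keytab` and `spark.yarn.principal` (the property names used by the actual change in the diff below); the class and method names are illustrative only:
```
import java.io.IOException;
import java.util.Properties;

import org.apache.hadoop.security.UserGroupInformation;

// Hypothetical sketch: perform the Kerberos login before creating any Spark
// objects, so that SecurityManager later finds the secret key in the
// logged-in user's credentials.
public class KerberosLogin {
  public static void loginIfConfigured(Properties props) {
    String keytab = props.getProperty("spark.yarn.keytab");
    String principal = props.getProperty("spark.yarn.principal");
    if (keytab != null && principal != null) {
      try {
        // Hadoop's UserGroupInformation takes (principal, keytab path).
        UserGroupInformation.loginUserFromKeytab(principal, keytab);
      } catch (IOException e) {
        throw new RuntimeException("Kerberos authentication failed", e);
      }
    }
  }
}
```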

### What type of PR is it?
[Bug Fix]

### What is the Jira issue?
* https://issues.apache.org/jira/browse/ZEPPELIN-1215

### How should this be tested?
Test it manually on a secured cluster.

### Questions:
* Do the license files need to be updated? No
* Are there breaking changes for older versions? No
* Does this need documentation? No

Author: Jeff Zhang <zjffdu@apache.org>

Closes #1237 from zjffdu/ZEPPELIN-1242 and squashes the following commits:

de7c529 [Jeff Zhang] ZEPPELIN-1242. Should set property SPARK_YARN_MODE and do login before creating any spark stuff


Project: http://git-wip-us.apache.org/repos/asf/zeppelin/repo
Commit: http://git-wip-us.apache.org/repos/asf/zeppelin/commit/83e7da7a
Tree: http://git-wip-us.apache.org/repos/asf/zeppelin/tree/83e7da7a
Diff: http://git-wip-us.apache.org/repos/asf/zeppelin/diff/83e7da7a

Branch: refs/heads/master
Commit: 83e7da7aa6d19bd8489e2af470d9856ebc270524
Parents: 848dbd0
Author: Jeff Zhang <zjffdu@apache.org>
Authored: Thu Jul 28 14:13:42 2016 +0800
Committer: Prabhjyot Singh <prabhjyotsingh@gmail.com>
Committed: Sat Jul 30 11:53:20 2016 +0530

----------------------------------------------------------------------
 .../apache/zeppelin/spark/SparkInterpreter.java    | 17 +++++++++++++++++
 1 file changed, 17 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/zeppelin/blob/83e7da7a/spark/src/main/java/org/apache/zeppelin/spark/SparkInterpreter.java
----------------------------------------------------------------------
diff --git a/spark/src/main/java/org/apache/zeppelin/spark/SparkInterpreter.java b/spark/src/main/java/org/apache/zeppelin/spark/SparkInterpreter.java
index f716f7f..29c322d 100644
--- a/spark/src/main/java/org/apache/zeppelin/spark/SparkInterpreter.java
+++ b/spark/src/main/java/org/apache/zeppelin/spark/SparkInterpreter.java
@@ -18,6 +18,7 @@
 package org.apache.zeppelin.spark;
 
 import java.io.File;
+import java.io.IOException;
 import java.io.PrintWriter;
 import java.lang.reflect.Constructor;
 import java.lang.reflect.Field;
@@ -32,6 +33,7 @@ import java.util.concurrent.atomic.AtomicInteger;
 
 import com.google.common.base.Joiner;
 
+import org.apache.hadoop.security.UserGroupInformation;
 import org.apache.spark.SparkConf;
 import org.apache.spark.SparkContext;
 import org.apache.spark.SparkEnv;
@@ -524,6 +526,21 @@ public class SparkInterpreter extends Interpreter {
 
   @Override
   public void open() {
+    // set properties and do login before creating any spark stuff for secured cluster
+    if (getProperty("master").equals("yarn-client")) {
+      System.setProperty("SPARK_YARN_MODE", "true");
+    }
+    if (getProperty().contains("spark.yarn.keytab") &&
+            getProperty().contains("spark.yarn.principal")) {
+      try {
+        String keytab = getProperty().getProperty("spark.yarn.keytab");
+        String principal = getProperty().getProperty("spark.yarn.principal");
+        UserGroupInformation.loginUserFromKeytab(principal, keytab);
+      } catch (IOException e) {
+        throw new RuntimeException("Can not pass kerberos authentication", e);
+      }
+    }
+
     conf = new SparkConf();
     URL[] urls = getClassloaderUrls();
 

