spark-issues mailing list archives

From "Petr Bouska (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (SPARK-19566) Error initializing SparkContext under a Windows SYSTEM user
Date Sun, 12 Feb 2017 19:34:41 GMT

     [ https://issues.apache.org/jira/browse/SPARK-19566?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Petr Bouska updated SPARK-19566:
--------------------------------
    Description: 
We use the SparkLauncher class in our application, which runs in a WebSphere Application
Server (it is started as a service). When we try to submit an application to Spark (running
in standalone mode without Hadoop), we get this error:
ERROR SparkContext: Error initializing SparkContext.
Exception in thread "main" java.io.IOException: failure to login
	at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:824)
	at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:761)
	at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:634)
	at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2828)
	at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2818)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2684)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:373)
	at org.apache.hadoop.fs.Path.getFileSystem(Path.java:295)
	at org.apache.spark.SparkContext.addFile(SparkContext.scala:1452)
	at org.apache.spark.SparkContext.addFile(SparkContext.scala:1425)
	at org.apache.spark.SparkContext$$anonfun$12.apply(SparkContext.scala:470)
	at org.apache.spark.SparkContext$$anonfun$12.apply(SparkContext.scala:470)
	at scala.collection.immutable.List.foreach(List.scala:381)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:470)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:117)
	at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:53)
	at com.ibm.el.expertise.spark.MatrixCompletionRunner.main(MatrixCompletionRunner.java:46)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:95)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:56)
	at java.lang.reflect.Method.invoke(Method.java:620)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: javax.security.auth.login.LoginException: java.lang.ArrayIndexOutOfBoundsException:
Array index out of range: 3
	at com.ibm.security.auth.module.Win64System.getCurrent(Native Method)
	at com.ibm.security.auth.module.Win64System.<init>(Win64System.java:74)
	at com.ibm.security.auth.module.Win64LoginModule.login(Win64LoginModule.java:143)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:95)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:56)
	at java.lang.reflect.Method.invoke(Method.java:620)
	at javax.security.auth.login.LoginContext.invoke(LoginContext.java:781)
	at javax.security.auth.login.LoginContext.access$000(LoginContext.java:215)
	at javax.security.auth.login.LoginContext$4.run(LoginContext.java:706)
	at javax.security.auth.login.LoginContext$4.run(LoginContext.java:704)
	at java.security.AccessController.doPrivileged(AccessController.java:456)
	at javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:703)
	at javax.security.auth.login.LoginContext.login(LoginContext.java:609)
	at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:799)
	at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:761)
	at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:634)
	at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2828)
	at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2818)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2684)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:373)
	at org.apache.hadoop.fs.Path.getFileSystem(Path.java:295)
	at org.apache.spark.SparkContext.addFile(SparkContext.scala:1452)
	at org.apache.spark.SparkContext.addFile(SparkContext.scala:1425)
	at org.apache.spark.SparkContext$$anonfun$12.apply(SparkContext.scala:470)
	at org.apache.spark.SparkContext$$anonfun$12.apply(SparkContext.scala:470)
	at scala.collection.immutable.List.foreach(List.scala:381)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:470)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:117)
	at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:53)
	at com.ibm.el.expertise.spark.MatrixCompletionRunner.main(MatrixCompletionRunner.java:46)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:95)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:56)
	at java.lang.reflect.Method.invoke(Method.java:620)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

	at javax.security.auth.login.LoginContext.invoke(LoginContext.java:884)
	at javax.security.auth.login.LoginContext.access$000(LoginContext.java:215)
	at javax.security.auth.login.LoginContext$4.run(LoginContext.java:706)
	at javax.security.auth.login.LoginContext$4.run(LoginContext.java:704)
	at java.security.AccessController.doPrivileged(AccessController.java:456)
	at javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:703)
	at javax.security.auth.login.LoginContext.login(LoginContext.java:609)
	at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:799)
	... 25 more
17/02/12 19:57:17 INFO ShutdownHookManager: Shutdown hook called
17/02/12 19:57:17 INFO ShutdownHookManager: Deleting directory C:\Windows\Temp\spark-89696e71-a1bd-49f5-86e3-23f2f805f381
getting access token
  [getToken] got user access token
getting user info
  [getUser] Got TokenUser info
  [getUser] userName: SYSTEM, domainName = NT AUTHORITY
  [getUser] userSid: S-1-5-18
  [getUser] LookupAccountName error: 1332

Spark uses the SYSTEM account for logging in even though we specify the environment variable SPARK_USER
in a batch file.
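
For reference, here is a minimal sketch of the kind of SparkLauncher invocation involved; the Spark home, master URL, jar path and user name below are hypothetical placeholders, not taken from the actual setup. Even with SPARK_USER (and HADOOP_USER_NAME) passed to the child process, the Hadoop UserGroupInformation login still goes through the platform JAAS login module, which is where it fails under the SYSTEM account.

{code:java}
import java.util.HashMap;
import java.util.Map;

import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class LaunchFromService {
    public static void main(String[] args) throws Exception {
        // Environment for the spark-submit child process (placeholder values).
        Map<String, String> env = new HashMap<>();
        env.put("SPARK_USER", "sparkuser");        // hypothetical user name
        env.put("HADOOP_USER_NAME", "sparkuser");  // hypothetical user name

        SparkAppHandle handle = new SparkLauncher(env)
                .setSparkHome("C:\\spark-2.1.0")                    // hypothetical path
                .setMaster("spark://spark-master:7077")             // hypothetical master URL
                .setAppResource("C:\\apps\\matrix-completion.jar")  // hypothetical jar
                .setMainClass("com.ibm.el.expertise.spark.MatrixCompletionRunner")
                .startApplication();

        // Block until the submitted application reaches a terminal state.
        while (!handle.getState().isFinal()) {
            Thread.sleep(1000);
        }
    }
}
{code}

Using startApplication() rather than launch() also lets the service poll the handle's state instead of parsing spark-submit output.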


  was:
We use the SparkLauncher class in our application, which runs in a WebSphere Application
Server (it is started as a service). When we try to submit an application to Spark, we get
this error:
ERROR SparkContext: Error initializing SparkContext.
Exception in thread "main" java.io.IOException: failure to login
	at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:824)
	at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:761)
	at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:634)
	at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2828)
	at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2818)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2684)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:373)
	at org.apache.hadoop.fs.Path.getFileSystem(Path.java:295)
	at org.apache.spark.SparkContext.addFile(SparkContext.scala:1452)
	at org.apache.spark.SparkContext.addFile(SparkContext.scala:1425)
	at org.apache.spark.SparkContext$$anonfun$12.apply(SparkContext.scala:470)
	at org.apache.spark.SparkContext$$anonfun$12.apply(SparkContext.scala:470)
	at scala.collection.immutable.List.foreach(List.scala:381)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:470)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:117)
	at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:53)
	at com.ibm.el.expertise.spark.MatrixCompletionRunner.main(MatrixCompletionRunner.java:46)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:95)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:56)
	at java.lang.reflect.Method.invoke(Method.java:620)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: javax.security.auth.login.LoginException: java.lang.ArrayIndexOutOfBoundsException:
Array index out of range: 3
	at com.ibm.security.auth.module.Win64System.getCurrent(Native Method)
	at com.ibm.security.auth.module.Win64System.<init>(Win64System.java:74)
	at com.ibm.security.auth.module.Win64LoginModule.login(Win64LoginModule.java:143)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:95)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:56)
	at java.lang.reflect.Method.invoke(Method.java:620)
	at javax.security.auth.login.LoginContext.invoke(LoginContext.java:781)
	at javax.security.auth.login.LoginContext.access$000(LoginContext.java:215)
	at javax.security.auth.login.LoginContext$4.run(LoginContext.java:706)
	at javax.security.auth.login.LoginContext$4.run(LoginContext.java:704)
	at java.security.AccessController.doPrivileged(AccessController.java:456)
	at javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:703)
	at javax.security.auth.login.LoginContext.login(LoginContext.java:609)
	at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:799)
	at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:761)
	at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:634)
	at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2828)
	at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2818)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2684)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:373)
	at org.apache.hadoop.fs.Path.getFileSystem(Path.java:295)
	at org.apache.spark.SparkContext.addFile(SparkContext.scala:1452)
	at org.apache.spark.SparkContext.addFile(SparkContext.scala:1425)
	at org.apache.spark.SparkContext$$anonfun$12.apply(SparkContext.scala:470)
	at org.apache.spark.SparkContext$$anonfun$12.apply(SparkContext.scala:470)
	at scala.collection.immutable.List.foreach(List.scala:381)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:470)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:117)
	at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:53)
	at com.ibm.el.expertise.spark.MatrixCompletionRunner.main(MatrixCompletionRunner.java:46)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:95)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:56)
	at java.lang.reflect.Method.invoke(Method.java:620)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

	at javax.security.auth.login.LoginContext.invoke(LoginContext.java:884)
	at javax.security.auth.login.LoginContext.access$000(LoginContext.java:215)
	at javax.security.auth.login.LoginContext$4.run(LoginContext.java:706)
	at javax.security.auth.login.LoginContext$4.run(LoginContext.java:704)
	at java.security.AccessController.doPrivileged(AccessController.java:456)
	at javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:703)
	at javax.security.auth.login.LoginContext.login(LoginContext.java:609)
	at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:799)
	... 25 more
17/02/12 19:57:17 INFO ShutdownHookManager: Shutdown hook called
17/02/12 19:57:17 INFO ShutdownHookManager: Deleting directory C:\Windows\Temp\spark-89696e71-a1bd-49f5-86e3-23f2f805f381
getting access token
  [getToken] got user access token
getting user info
  [getUser] Got TokenUser info
  [getUser] userName: SYSTEM, domainName = NT AUTHORITY
  [getUser] userSid: S-1-5-18
  [getUser] LookupAccountName error: 1332

Spark uses the SYSTEM account for logging in even though we specify the environment variable SPARK_USER
in a batch file.



> Error initializing SparkContext under a Windows SYSTEM user
> -----------------------------------------------------------
>
>                 Key: SPARK-19566
>                 URL: https://issues.apache.org/jira/browse/SPARK-19566
>             Project: Spark
>          Issue Type: Bug
>          Components: Windows
>    Affects Versions: 2.1.0
>            Reporter: Petr Bouska
>
> We use the SparkLauncher class in our application, which runs in a WebSphere Application
Server (it is started as a service). When we try to submit an application to Spark (running
in standalone mode without Hadoop), we get this error:
> ERROR SparkContext: Error initializing SparkContext.
> Exception in thread "main" java.io.IOException: failure to login
> 	at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:824)
> 	at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:761)
> 	at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:634)
> 	at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2828)
> 	at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2818)
> 	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2684)
> 	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:373)
> 	at org.apache.hadoop.fs.Path.getFileSystem(Path.java:295)
> 	at org.apache.spark.SparkContext.addFile(SparkContext.scala:1452)
> 	at org.apache.spark.SparkContext.addFile(SparkContext.scala:1425)
> 	at org.apache.spark.SparkContext$$anonfun$12.apply(SparkContext.scala:470)
> 	at org.apache.spark.SparkContext$$anonfun$12.apply(SparkContext.scala:470)
> 	at scala.collection.immutable.List.foreach(List.scala:381)
> 	at org.apache.spark.SparkContext.<init>(SparkContext.scala:470)
> 	at org.apache.spark.SparkContext.<init>(SparkContext.scala:117)
> 	at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:53)
> 	at com.ibm.el.expertise.spark.MatrixCompletionRunner.main(MatrixCompletionRunner.java:46)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:95)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:56)
> 	at java.lang.reflect.Method.invoke(Method.java:620)
> 	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
> 	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
> 	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
> 	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
> 	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Caused by: javax.security.auth.login.LoginException: java.lang.ArrayIndexOutOfBoundsException:
Array index out of range: 3
> 	at com.ibm.security.auth.module.Win64System.getCurrent(Native Method)
> 	at com.ibm.security.auth.module.Win64System.<init>(Win64System.java:74)
> 	at com.ibm.security.auth.module.Win64LoginModule.login(Win64LoginModule.java:143)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:95)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:56)
> 	at java.lang.reflect.Method.invoke(Method.java:620)
> 	at javax.security.auth.login.LoginContext.invoke(LoginContext.java:781)
> 	at javax.security.auth.login.LoginContext.access$000(LoginContext.java:215)
> 	at javax.security.auth.login.LoginContext$4.run(LoginContext.java:706)
> 	at javax.security.auth.login.LoginContext$4.run(LoginContext.java:704)
> 	at java.security.AccessController.doPrivileged(AccessController.java:456)
> 	at javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:703)
> 	at javax.security.auth.login.LoginContext.login(LoginContext.java:609)
> 	at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:799)
> 	at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:761)
> 	at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:634)
> 	at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2828)
> 	at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2818)
> 	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2684)
> 	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:373)
> 	at org.apache.hadoop.fs.Path.getFileSystem(Path.java:295)
> 	at org.apache.spark.SparkContext.addFile(SparkContext.scala:1452)
> 	at org.apache.spark.SparkContext.addFile(SparkContext.scala:1425)
> 	at org.apache.spark.SparkContext$$anonfun$12.apply(SparkContext.scala:470)
> 	at org.apache.spark.SparkContext$$anonfun$12.apply(SparkContext.scala:470)
> 	at scala.collection.immutable.List.foreach(List.scala:381)
> 	at org.apache.spark.SparkContext.<init>(SparkContext.scala:470)
> 	at org.apache.spark.SparkContext.<init>(SparkContext.scala:117)
> 	at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:53)
> 	at com.ibm.el.expertise.spark.MatrixCompletionRunner.main(MatrixCompletionRunner.java:46)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:95)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:56)
> 	at java.lang.reflect.Method.invoke(Method.java:620)
> 	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
> 	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
> 	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
> 	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
> 	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> 	at javax.security.auth.login.LoginContext.invoke(LoginContext.java:884)
> 	at javax.security.auth.login.LoginContext.access$000(LoginContext.java:215)
> 	at javax.security.auth.login.LoginContext$4.run(LoginContext.java:706)
> 	at javax.security.auth.login.LoginContext$4.run(LoginContext.java:704)
> 	at java.security.AccessController.doPrivileged(AccessController.java:456)
> 	at javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:703)
> 	at javax.security.auth.login.LoginContext.login(LoginContext.java:609)
> 	at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:799)
> 	... 25 more
> 17/02/12 19:57:17 INFO ShutdownHookManager: Shutdown hook called
> 17/02/12 19:57:17 INFO ShutdownHookManager: Deleting directory C:\Windows\Temp\spark-89696e71-a1bd-49f5-86e3-23f2f805f381
> getting access token
>   [getToken] got user access token
> getting user info
>   [getUser] Got TokenUser info
>   [getUser] userName: SYSTEM, domainName = NT AUTHORITY
>   [getUser] userSid: S-1-5-18
>   [getUser] LookupAccountName error: 1332
> Spark uses the SYSTEM account for logging in even though we specify the environment variable
SPARK_USER in a batch file.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

