Date: Thu, 26 May 2016 01:25:12 +0000 (UTC)
From: "Sean Owen (JIRA)"
To: issues@spark.apache.org
Subject: [jira] [Resolved] (SPARK-15524) Strange issue starting spark-shell: screen block

[ https://issues.apache.org/jira/browse/SPARK-15524?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-15524.
-------------------------------
    Resolution: Not A Problem

This is specific to your environment. Your config is specifying a wrong host/port somewhere for the YARN/HDFS daemons, or they aren't reachable.
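The failing endpoint in the trace below, 0.0.0.0:8032, is the usual sign of exactly that: 8032 is the ResourceManager's client port, and 0.0.0.0 is the fallback a YARN client uses when no ResourceManager address is configured or the Hadoop config isn't picked up. A minimal sketch of the checks this resolution implies; the paths and the 192.168.1.110 host come from the report below, and the commands are illustrative rather than a prescribed fix:

    # Assumes HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop, as in the report below.
    # 1. Is a real ResourceManager host/port configured, or is the property unset
    #    (which leaves clients trying 0.0.0.0:8032)?
    grep -A2 'yarn.resourcemanager' /usr/local/hadoop/etc/hadoop/yarn-site.xml

    # 2. Is the ResourceManager process actually up on the master node?
    jps | grep ResourceManager

    # 3. Is its client port reachable from the machine running spark-shell?
    nc -zv 192.168.1.110 8032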
> Strange issue starting spark-shell: screen block
> ------------------------------------------------
>
>                 Key: SPARK-15524
>                 URL: https://issues.apache.org/jira/browse/SPARK-15524
>             Project: Spark
>          Issue Type: Bug
>            Reporter: Posty
>
> When I start spark-shell with YARN, I'm getting a strange issue. The screen stops here:
> http://i.stack.imgur.com/kmska.png
> No errors, just this screen. Do you know what the issue could be?
> The only two Spark files I configured were spark-env.sh:
> HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop
> export SPARK_MASTER_IP=192.168.1.110
> export SPARK_LOCAL_IP=192.168.1.110
> export SPARK_EXECUTOR_MEMORY=4G
> And slaves:
> 192.168.1.110
> 192.168.1.111
> I waited for 30 minutes and the screen got unblocked, but now it shows the error below. Any ideas why this could be?
> 16/05/25 11:13:43 ERROR SparkContext: Error initializing SparkContext.
> java.net.ConnectException: Call From master/192.168.1.110 to 0.0.0.0:8032 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
>         at sun.reflect.GeneratedConstructorAccessor12.newInstance(Unknown Source)
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>         at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
>         at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:731)
>         at org.apache.hadoop.ipc.Client.call(Client.java:1472)
>         at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>         at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
>         at com.sun.proxy.$Proxy11.getNewApplication(Unknown Source)
>         at org.apache.hadoop.yarn.api.impl.pb.client.ApplicationClientProtocolPBClientImpl.getNewApplication(ApplicationClientProtocolPBClientImpl.java:217)
>         at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:498)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
>         at com.sun.proxy.$Proxy12.getNewApplication(Unknown Source)
>         at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getNewApplication(YarnClientImpl.java:206)
>         at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createApplication(YarnClientImpl.java:214)
>         at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:132)
>         at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:57)
>         at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:144)
>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:530)
>         at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
>         at $line3.$read$$iwC$$iwC.<init>(<console>:15)
>         at $line3.$read$$iwC.<init>(<console>:24)
>         at $line3.$read.<init>(<console>:26)
>         at $line3.$read$.<init>(<console>:30)
>         at $line3.$read$.<clinit>(<console>)
>         at $line3.$eval$.<init>(<console>:7)
>         at $line3.$eval$.<clinit>(<console>)
>         at $line3.$eval.$print(<console>)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:498)
>         at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
>         at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
>         at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
>         at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
>         at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
>         at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
>         at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
>         at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
>         at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
>         at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
>         at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
>         at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
>         at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
>         at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
>         at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>         at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>         at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
>         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
>         at org.apache.spark.repl.Main$.main(Main.scala:31)
>         at org.apache.spark.repl.Main.main(Main.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:498)
>         at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
>         at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Caused by: java.net.ConnectException: Connection refused
>         at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
>         at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
>         at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
>         at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
>         at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:494)
>         at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:607)
>         at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:705)
>         at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:368)
>         at org.apache.hadoop.ipc.Client.getConnection(Client.java:1521)
>         at org.apache.hadoop.ipc.Client.call(Client.java:1438)
>         ... 64 more
> 16/05/25 11:13:43 WARN MetricsSystem: Stopping a MetricsSystem that is not running
> java.net.ConnectException: Call From sgd34.dei.uc.pt/10.17.0.88 to 0.0.0.0:8032 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
>         [stack trace essentially identical to the one above, ending in the same Caused by: java.net.ConnectException: Connection refused ... 64 more]
> java.lang.NullPointerException
>         at org.apache.spark.sql.SQLContext$.createListenerAndUI(SQLContext.scala:1367)
>         at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>         at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
>         at $iwC$$iwC.<init>(<console>:15)
>         at $iwC.<init>(<console>:24)
>         at <init>(<console>:26)
>         at .<init>(<console>:30)
>         at .<clinit>(<console>)
>         at .<init>(<console>:7)
>         at .<clinit>(<console>)
>         at $print(<console>)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:498)
>         at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
>         at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
>         at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
>         at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
>         at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
>         at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
>         at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:132)
>         at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
>         at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
>         at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
>         at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
>         at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
>         at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
>         at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
>         at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>         at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>         at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
>         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
>         at org.apache.spark.repl.Main$.main(Main.scala:31)
>         at org.apache.spark.repl.Main.main(Main.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:498)
>         at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
>         at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> <console>:16: error: not found: value sqlContext
>          import sqlContext.implicits._
>                 ^
> <console>:16: error: not found: value sqlContext
>          import sqlContext.sql
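The closing "not found: value sqlContext" errors are a downstream symptom, not a second bug: the SparkContext never initialized, so the shell could not create a SQLContext or bind sqlContext, and its automatic imports then fail. One quick way to confirm the problem is confined to YARN rather than Spark itself (hypothetical invocations, using the Spark 1.x flags the trace implies):

    # If the shell starts cleanly with a local master, Spark is fine and the
    # failure is limited to reaching the YARN ResourceManager.
    ./bin/spark-shell --master local[2]

    # Retry against YARN once yarn-site.xml points at the real RM host/port.
    ./bin/spark-shell --master yarn-client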