spark-issues mailing list archives

From "Shixiong Zhu (JIRA)" <j...@apache.org>
Subject [jira] [Comment Edited] (SPARK-12784) Spark UI IndexOutOfBoundsException with dynamic allocation
Date Tue, 12 Jan 2016 21:58:39 GMT

    [ https://issues.apache.org/jira/browse/SPARK-12784?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15095043#comment-15095043 ]

Shixiong Zhu edited comment on SPARK-12784 at 1/12/16 9:58 PM:
---------------------------------------------------------------

I think I found the root cause. Working on a fix.


was (Author: zsxwing):
I think I found the root cause. Working on a fix. Actually, similar issues exist in other
places, and I will fix them as well.

> Spark UI IndexOutOfBoundsException with dynamic allocation
> ----------------------------------------------------------
>
>                 Key: SPARK-12784
>                 URL: https://issues.apache.org/jira/browse/SPARK-12784
>             Project: Spark
>          Issue Type: Bug
>          Components: Web UI, YARN
>    Affects Versions: 1.5.2
>            Reporter: Thomas Graves
>            Assignee: Shixiong Zhu
>
> Trying to load the web UI Executors page when using dynamic allocation on YARN can lead to an IndexOutOfBoundsException.
> I'm assuming this is caused by the number of executors changing while the page is being loaded, since executors were being released at the time.
> HTTP ERROR 500
> Problem accessing /executors/. Reason:
>     Server Error
> Caused by:
> java.lang.IndexOutOfBoundsException: 1058
> 	at scala.collection.LinearSeqOptimized$class.apply(LinearSeqOptimized.scala:52)
> 	at scala.collection.immutable.Stream.apply(Stream.scala:185)
> 	at org.apache.spark.ui.exec.ExecutorsPage$.getExecInfo(ExecutorsPage.scala:180)
> 	at org.apache.spark.ui.exec.ExecutorsPage$$anonfun$11.apply(ExecutorsPage.scala:60)
> 	at org.apache.spark.ui.exec.ExecutorsPage$$anonfun$11.apply(ExecutorsPage.scala:59)
> 	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
> 	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
> 	at scala.collection.immutable.Range.foreach(Range.scala:141)
> 	at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
> 	at scala.collection.AbstractTraversable.map(Traversable.scala:105)
> 	at org.apache.spark.ui.exec.ExecutorsPage.render(ExecutorsPage.scala:59)
> 	at org.apache.spark.ui.WebUI$$anonfun$2.apply(WebUI.scala:79)
> 	at org.apache.spark.ui.WebUI$$anonfun$2.apply(WebUI.scala:79)
> 	at org.apache.spark.ui.JettyUtils$$anon$1.doGet(JettyUtils.scala:69)
> 	at javax.servlet.http.HttpServlet.service(HttpServlet.java:735)
> 	at javax.servlet.http.HttpServlet.service(HttpServlet.java:848)
> 	at org.spark-project.jetty.servlet.ServletHolder.handle(ServletHolder.java:684)
> 	at org.spark-project.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1496)
> 	at org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter.doFilter(AmIpFilter.java:164)
> 	at org.spark-project.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1467)
> 	at org.spark-project.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:499)
> 	at org.spark-project.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1086)
> 	at org.spark-project.jetty.servlet.ServletHandler.doScope(ServletHandler.java:428)
> 	at org.spark-project.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1020)
> 	at org.spark-project.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135)
> 	at org.spark-project.jetty.server.handler.GzipHandler.handle(GzipHandler.java:264)
> 	at org.spark-project.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:255)
> 	at org.spark-project.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:116)
> 	at org.spark-project.jetty.server.Server.handle(Server.java:370)
> 	at org.spark-project.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:494)
> 	at org.spark-project.jetty.server.AbstractHttpConnection.headerComplete(AbstractHttpConnection.java:971)
> 	at org.spark-project.jetty.server.AbstractHttpConnection$RequestHandler.headerComplete(AbstractHttpConnection.java:1033)
> 	at org.spark-project.jetty.http.HttpParser.parseNext(HttpParser.java:644)
> 	at org.spark-project.jetty.http.HttpParser.parseAvailable(HttpParser.java:235)
> 	at org.spark-project.jetty.server.AsyncHttpConnection.handle(AsyncHttpConnection.java:82)
> 	at org.spark-project.jetty.io.nio.SelectChannelEndPoint.handle(SelectChannelEndPoint.java:667)
> 	at org.spark-project.jetty.io.nio.SelectChannelEndPoint$1.run(SelectChannelEndPoint.java:52)
> 	at org.spark-project.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)
> 	at org.spark-project.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)
> 	at java.lang.Thread.run(Thread.java:745)
> Simply reloading the page eventually gets the UI to come up, so it's not a blocker, but it's not a very friendly experience either.
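
The race described above can be sketched in a few lines of Scala. This is an illustrative reconstruction, not Spark's actual code: the buffer, sizes, and object name below are hypothetical, but the pattern matches the stack trace, where `ExecutorsPage.getExecInfo` indexed into a collection using a range computed before concurrent executor removals shrank it.

```scala
import scala.collection.mutable.ArrayBuffer

// Hypothetical sketch of the SPARK-12784 race. A mutable view of live
// executors is indexed by a range captured earlier, while dynamic
// allocation removes executors concurrently.
object ExecutorsRaceSketch {
  def main(args: Array[String]): Unit = {
    val executors = ArrayBuffer.tabulate(1100)(i => s"executor-$i")

    // Racy pattern, roughly what the stack trace shows:
    // capture the count first, then index the live collection later.
    val n = executors.size                          // 1100
    executors.remove(1059, executors.size - 1059)   // executors released meanwhile
    // (0 until n).map(i => executors(i))           // would throw
    //                                              // IndexOutOfBoundsException: 1059

    // Safe pattern: take one immutable snapshot, then render only from it.
    val snapshot = executors.toIndexedSeq
    val rows = snapshot.indices.map(i => snapshot(i))
    println(rows.size)                              // 1059
  }
}
```

Rendering from a single immutable snapshot means the page may be momentarily stale, but it can never index past the end of the collection it is iterating.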



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

