spark-dev mailing list archives

From Mubarak Seyed <spark.devu...@gmail.com>
Subject Re: OutOfMemoryError when running sbt/sbt test
Date Wed, 27 Aug 2014 00:58:08 GMT
What is your ulimit value?
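
For "unable to create new native thread" the heap flags usually aren't the
culprit; it is typically the per-user process/thread limit (or the open-file
limit) that the build is hitting, since every JVM thread counts as a
lightweight process on Linux. A minimal sketch of how to check and raise
those limits, assuming a Linux box with bash (the numbers are illustrative,
not a recommendation):

    # Check the current limits for this shell:
    ulimit -u    # max user processes (each JVM thread counts against this)
    ulimit -n    # max open file descriptors
    ulimit -a    # show all limits

    # Raise them for the current shell before running the build
    # (values are illustrative; adjust for your machine):
    ulimit -u 4096
    ulimit -n 8192
    sbt/sbt test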


On Tue, Aug 26, 2014 at 5:49 PM, jay vyas <jayunit100.apache@gmail.com>
wrote:

> Hi spark.
>
> I've been trying to build Spark, but I keep getting lots of
> OutOfMemoryError exceptions.
>
> https://gist.github.com/jayunit100/d424b6b825ce8517d68c
>
> For the most part, they are of the form:
>
> java.lang.OutOfMemoryError: unable to create new native thread
>
> I've attempted to hard-code the "get_mem_opts" function in the
> sbt-launch-lib.bash file to use various very high parameter sizes
> (e.g. "-Xms5g") along with a high MaxPermSize, etc., but to no avail.
>
> Any thoughts on this would be appreciated.
>
> I know of others having the same problem as well.
>
> Thanks!
>
> --
> jay vyas
>
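
For context on the change described in the quoted message: a hard-coded
get_mem_opts might look roughly like the sketch below. This is only an
illustration of the kind of edit Jay describes, not the actual contents of
his sbt-launch-lib.bash; the stock function normally derives these flags
from its memory argument, and the details vary between sbt versions.

    # Hypothetical hard-coded get_mem_opts in sbt-launch-lib.bash:
    # ignore the computed defaults and always return fixed, generous
    # JVM options.
    get_mem_opts () {
      echo "-Xms5g -Xmx5g -XX:MaxPermSize=1g -XX:ReservedCodeCacheSize=512m"
    }

Note that none of these flags change how many native threads the JVM can
create; that is bounded by the OS limits (and thread stack sizes), which is
why the ulimit question above is the first thing to check.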
