ignite-user mailing list archives

From vkulichenko <valentin.kuliche...@gmail.com>
Subject Re: OutOfMemoryError with Hadoop backing filesystem
Date Fri, 06 Nov 2015 00:07:07 GMT

Can you please subscribe to the mailing list so that we receive email
notifications for your posts? You can refer to this instruction:

Stolidedog wrote
> I get an OutOfMemoryError when I have multiple threads writing small files
> to IGFS with Hadoop configured as a backing file system.

How much data do you have? Most likely you just need to increase the heap
size (via the -Xmx JVM option). You can also limit the amount of memory
allocated to the in-memory file system:

<bean class="org.apache.ignite.configuration.FileSystemConfiguration">
    <property name="maxSpaceSize" value="#{4L * 1024 * 1024 * 1024}"/>
</bean>
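
One pitfall with that expression: if the multiplication is written with plain
int literals (4 * 1024 * 1024 * 1024), it is evaluated in 32-bit arithmetic
and silently overflows, so the first factor should be the long literal 4L.
A quick plain-Java check of the arithmetic (a sketch of the overflow itself,
not of any Ignite API):

```java
public class HeapSizing {
    public static void main(String[] args) {
        // int arithmetic: 4 * 1024^3 = 2^32 exceeds Integer.MAX_VALUE
        // and wraps around to 0.
        int wrong = 4 * 1024 * 1024 * 1024;

        // long arithmetic: the 4L literal promotes the whole expression
        // to long, giving the intended 4 GB value.
        long right = 4L * 1024 * 1024 * 1024;

        System.out.println(wrong); // 0
        System.out.println(right); // 4294967296
    }
}
```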

Note that this limit should be around 1-2 GB less than the maximum heap size,
because the node itself needs some memory to function.

Let us know if it helps.


