ignite-user mailing list archives

From Zhengqingzheng <zhengqingzh...@huawei.com>
Subject re: re: ignite problem when loading large amount of data into cache
Date Wed, 20 Apr 2016 06:39:08 GMT
Hi Val,
When the exception occurred, I checked the forum and increased the JVM heap size to 30 GB, and also
split my table into 10 smaller tables, each containing about 1 GB of data.
At that point I saw your suggestion about the off-heap settings. I don't want to reload all the data
again, so I asked whether it is possible to make the configuration change take effect immediately at runtime.

Btw, I see there are backup settings in PARTITIONED mode, like this: <property name="backups"
But I did not see where those backups are stored. Is there any setting, as in Redis, that
automatically loads dump files when a node recovers from a crash?
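For context: in the Ignite 1.x releases current at the time of this thread, backups in PARTITIONED mode are extra in-memory copies of each partition held on other cluster nodes, not dump files on disk; there is no built-in Redis-style RDB/AOF reload, and durability requires wiring up a CacheStore. A minimal configuration sketch (the cache name "myCache" is illustrative, not from the thread):

```xml
<!-- Sketch: PARTITIONED cache keeping one backup copy of each partition
     in the RAM of another cluster node. Nothing is written to disk unless
     a CacheStore is also configured. -->
<bean class="org.apache.ignite.configuration.CacheConfiguration">
    <property name="name" value="myCache"/>
    <property name="cacheMode" value="PARTITIONED"/>
    <property name="backups" value="1"/>
</bean>
```

With `backups=1`, the cluster can lose one node without losing data, because each partition's backup copy on a surviving node is promoted to primary.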


From: vkulichenko [mailto:valentin.kulichenko@gmail.com]
Sent: April 20, 2016 12:42
To: user@ignite.apache.org
Subject: Re: re: ignite problem when loading large amount of data into cache


Configuration of an existing cache can't be changed at runtime. The only option is to destroy
the cache and create it again with the new parameters (you will lose all in-memory data, of course).
What is the use case in which you need this?
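The destroy-and-recreate approach described above can be sketched in Java. This is a hedged sketch, not a recipe from the thread: the cache name and the off-heap settings are illustrative, it assumes a running Ignite 1.x node (the `CacheMemoryMode`/`offHeapMaxMemory` API was removed in Ignite 2.x), and it cannot run outside a cluster.

```java
// Sketch: change a cache's configuration by destroying and recreating it.
// All in-memory data in the old cache is lost. Assumes Ignite 1.x APIs;
// the cache name "myCache" and the off-heap values are illustrative.
import org.apache.ignite.Ignite;
import org.apache.ignite.Ignition;
import org.apache.ignite.cache.CacheMemoryMode;
import org.apache.ignite.configuration.CacheConfiguration;

public class RecreateCache {
    public static void main(String[] args) {
        try (Ignite ignite = Ignition.start()) {
            // Drop the old cache; its in-memory contents are discarded.
            ignite.destroyCache("myCache");

            // Recreate it with the new parameters, e.g. off-heap storage.
            CacheConfiguration<Integer, String> cfg =
                new CacheConfiguration<>("myCache");
            cfg.setMemoryMode(CacheMemoryMode.OFFHEAP_TIERED); // 1.x-era API
            cfg.setOffHeapMaxMemory(0); // 0 = unlimited off-heap in 1.x
            ignite.createCache(cfg);
        }
    }
}
```

After `createCache` returns, the data would have to be reloaded from the underlying store, which is exactly the cost the original poster was hoping to avoid.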


View this message in context: http://apache-ignite-users.70518.x6.nabble.com/ignite-problem-when-loading-large-amount-of-data-into-cache-tp4324p4347.html
Sent from the Apache Ignite Users mailing list archive at Nabble.com.