activemq-users mailing list archives

From alprausch77 <joachim.gl...@dematic.com>
Subject mKahaDB - no clean-up because of ACKs
Date Fri, 09 Mar 2018 09:51:02 GMT
Hello.
Recently we had a problem on an ActiveMQ 5.10 broker (with a manually
applied patch for AMQ-5542).
The mKahaDB data store grew to ~30 GB and could no longer clean up its
data files.

The log always showed something like this:
not removing data file: 317633 as contained ack(s) refer to referenced
file: [317632, 317633]

I'm aware that the data files can't be cleaned up if there is an
unconsumed message in a queue. But that's not the case here.
I started an ActiveMQ broker with a copy of the storage on my local
machine and checked every queue and topic via JConsole for remaining
messages - every queue/topic shows a size of 0.

So it seems to me that the messages were processed, but the ACKs are
somehow stuck in the store.

Is there a way to (manually) get rid of the ACKs?
Or is there a way to analyze the KahaDB storage files in more depth to
find the reason for the stuck ACKs?
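For the second question, this is roughly how I would try to dump the
journal entries myself, using the reader classes from the
activemq-kahadb-store module. This is only a sketch: the class names are
taken from the 5.x source tree and may differ between versions, it is
not a supported tool, and the directory argument is a placeholder (with
mKahaDB every filtered instance has its own journal directory):

import java.io.File;
import org.apache.activemq.store.kahadb.JournalCommand;
import org.apache.activemq.store.kahadb.data.KahaEntryType;
import org.apache.activemq.store.kahadb.disk.journal.Journal;
import org.apache.activemq.store.kahadb.disk.journal.Location;
import org.apache.activemq.util.ByteSequence;
import org.apache.activemq.util.DataByteArrayInputStream;

public class JournalDump {
    public static void main(String[] args) throws Exception {
        Journal journal = new Journal();
        // args[0]: path to one kahadb instance directory (placeholder)
        journal.setDirectory(new File(args[0]));
        journal.start();
        try {
            Location location = journal.getNextLocation(null);
            while (location != null) {
                ByteSequence data = journal.read(location);
                DataByteArrayInputStream in = new DataByteArrayInputStream(data);
                // The first byte identifies the entry type (add, remove/ack, ...)
                KahaEntryType type = KahaEntryType.valueOf(in.readByte());
                JournalCommand<?> command = (JournalCommand<?>) type.createMessage();
                command.mergeFramed(in);
                System.out.println(
                        location.getDataFileId() + " " + type + " " + command);
                location = journal.getNextLocation(location);
            }
        } finally {
            journal.close();
        }
    }
}

If that works, grepping the output of data file 317633 for
KAHA_REMOVE_MESSAGE_COMMAND entries should at least show which
destination the stuck ACKs belong to.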

I can provide the whole log of the KahaDB recovery if that is of any
help.

Thanks.
Joachim



--
Sent from: http://activemq.2283324.n4.nabble.com/ActiveMQ-User-f2341805.html
