activemq-users mailing list archives

From nervousbadger <nervousbad...@googlemail.com>
Subject KahaDB - log f
Date Tue, 14 Sep 2010 17:25:39 GMT

Hi,

We have the following scenario:

Single broker with ~20 queues. Some queues have extremely high throughput
(thousands of messages per second produced and consumed). Other queues have
low throughput and messages may remain on these queues for a couple of hours
before being consumed. All messages are persistent.

We are running out of file handles for the activemq process - looking in the
kahadb directory we can see ~1000 db-xxx.log files. If I understand the
documentation correctly, then each individual log file cannot be removed
until there are no more references to the messages logged within.

I think this means that if a single 32MB file contains even one message that was
sent to one of our slow-throughput queues and has not yet been consumed, the
file cannot be deleted until that last message is consumed, even though every
other message logged in the file has already been consumed.

If this is correct, is there any way around this (such as having the log
files be per queue)? Or will we have to increase the maximum log file
size/maximum number of open files?
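In case it helps frame the question, this is roughly the configuration change I
have in mind for raising the log file size: the kahaDB persistence adapter in
activemq.xml takes a journalMaxFileLength attribute (the per-file journal size,
which defaults to 32mb). The directory and the 128mb value below are just
placeholders, not something we have tested:

```xml
<broker xmlns="http://activemq.apache.org/schema/core" brokerName="localhost">
  <persistenceAdapter>
    <!-- journalMaxFileLength raises the per-file journal size from the
         32mb default, so the same amount of data spreads across fewer
         db-xxx.log files (and therefore fewer open file handles).
         Directory and size here are placeholders. -->
    <kahaDB directory="${activemq.data}/kahadb"
            journalMaxFileLength="128mb"/>
  </persistenceAdapter>
</broker>
```

The other knob I'm aware of would be raising the OS open-file limit for the
broker user (e.g. via ulimit -n), but that only delays the problem rather than
addressing the retention behaviour.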

Thanks,
Pete.
-- 
View this message in context: http://activemq.2283324.n4.nabble.com/KahaDB-log-f-tp2539334p2539334.html
Sent from the ActiveMQ - User mailing list archive at Nabble.com.
