activemq-users mailing list archives

From jsalvo <jesus.sa...@communicator.com.au>
Subject Re: 4.0 Consumer OutOfMemoryError bug?
Date Fri, 16 Jun 2006 08:23:25 GMT

> With regard to memory consumption: you are saying RAM usage grows as you
> produce messages. This makes sense since we tend to cache messages
> around so that they are in RAM when consumers start consuming them.
> For non-persistent messaging all messages stay in RAM. So firstly, are
> you using persistent messaging? (i.e. what is the delivery mode on the
> producer?)

I was using PERSISTENT mode:

      Connection conn = this.factory.createConnection();
      Session session = conn.createSession(false, Session.AUTO_ACKNOWLEDGE);
      MessageProducer qProducer = session.createProducer(this.myQueue);
      qProducer.setDeliveryMode(DeliveryMode.PERSISTENT);
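For reference, a self-contained version of that setup might look like the sketch below. The broker URL and queue name are placeholders I've made up for illustration, and it assumes the standard ActiveMQ JMS client; note that closing the connection also closes the session and producer.

```java
// Minimal sketch of a persistent-mode producer (assumed broker URL and
// queue name; requires a running ActiveMQ broker to actually send).
import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.DeliveryMode;
import javax.jms.JMSException;
import javax.jms.MessageProducer;
import javax.jms.Queue;
import javax.jms.Session;
import org.apache.activemq.ActiveMQConnectionFactory;

public class PersistentProducer {
    public static void main(String[] args) throws JMSException {
        ConnectionFactory factory =
                new ActiveMQConnectionFactory("tcp://localhost:61616");
        Connection conn = factory.createConnection();
        conn.start();
        try {
            // Non-transacted session with automatic acknowledgement,
            // matching the snippet above.
            Session session = conn.createSession(false, Session.AUTO_ACKNOWLEDGE);
            Queue queue = session.createQueue("TEST.QUEUE");
            MessageProducer producer = session.createProducer(queue);
            producer.setDeliveryMode(DeliveryMode.PERSISTENT);
            producer.send(session.createTextMessage("hello"));
        } finally {
            // Closing the connection releases the session and producer too.
            conn.close();
        }
    }
}
```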

> Secondly, the journal caches messages and acks in RAM until a
> checkpoint to JDBC occurs (for persistent messaging) - so you should
> see RAM use go down after you've sent a bunch of persistent messages,
> then some time later the checkpoint happens.

This does not seem to be the case for me. Having said that, when I was
running all of these tests, I was actually using Kaha persistence:

    <persistenceAdapter>
      <kahaPersistentAdaptor dir="activemq-data"/>
    </persistenceAdapter>

I'll try to see if it happens as well with the original config:

    <persistenceAdapter>
      <journaledJDBC journalLogFiles="5" dataDirectory="../activemq-data"/>
    </persistenceAdapter>



--
View this message in context: http://www.nabble.com/4.0-Consumer-OutOfMemoryError-bug--t1707655.html#a4896797
Sent from the ActiveMQ - User forum at Nabble.com.

