activemq-users mailing list archives

From "Hiram Chirino" <>
Subject Re: Consuming a group of messages in a single transaction
Date Thu, 27 Jul 2006 17:33:16 GMT
Hi Naaman,

Sending large streams in a transaction is not currently supported.  The main
reason we did not want to support that is that, currently, all in-progress
transactions have to be buffered in memory on the broker, so sending a large
stream could cause the broker to run out of memory.  If we change this in the
future (possibly by swapping the in-progress transaction to disk), then I
expect we could support transacted streams.
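For ordinary (non-stream) messages, consuming a whole group atomically already
works with a plain transacted, synchronous consumer. A minimal sketch of that
pattern follows; the broker URL, the queue name "GROUP.QUEUE" and the
"groupSize" message property are illustrative, not anything the thread above
prescribes:

import javax.jms.Connection;
import javax.jms.Message;
import javax.jms.MessageConsumer;
import javax.jms.Session;

import org.apache.activemq.ActiveMQConnectionFactory;

public class GroupConsumer {
    public static void main(String[] args) throws Exception {
        ActiveMQConnectionFactory factory =
                new ActiveMQConnectionFactory("tcp://localhost:61616");
        Connection connection = factory.createConnection();
        connection.start();

        // Transacted session: nothing is acknowledged until commit().
        Session session = connection.createSession(true, Session.SESSION_TRANSACTED);
        MessageConsumer consumer =
                session.createConsumer(session.createQueue("GROUP.QUEUE"));

        try {
            // Block synchronously for each message in the group.
            Message first = consumer.receive(5000);
            if (first == null) {
                session.rollback();
                return;
            }
            int groupSize = first.getIntProperty("groupSize");
            for (int i = 1; i < groupSize; i++) {
                Message next = consumer.receive(5000);
                if (next == null) {
                    // Incomplete group: put everything back on the queue.
                    session.rollback();
                    return;
                }
            }
            // All members arrived: acknowledge the whole group at once.
            session.commit();
        } catch (Exception e) {
            session.rollback();
            throw e;
        } finally {
            connection.close();
        }
    }
}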

On 7/27/06, nlif <> wrote:
> Thanks James. I followed your advice and rewrote my consumer as
> synchronous. This seems to be working quite well, but I do have two
> concerns:
> 1) In switching from async to sync consumers, I can no longer use Jencks
> as a container. This means I will need to implement some kind of
> thread-pooling myself, doesn't it? But I can still use Jencks for
> connection pooling, for both inbound and outbound, can't I?
> 2) In some cases, a group includes large files, for which I intended to
> use ActiveMQ streams. The problem is how to make the stream participate
> in the transaction (both on the producer and on the consumer side). That
> is, if my group includes 3 messages AND a file, then the consumer should
> commit both the 3 messages and the file, or roll back all of them. Is
> this at all possible? It seems to me like it isn't, since a stream API
> can't participate in a transaction. If so, is there another way for me to
> achieve both requirements: send several items as an atomic group (even
> across a cluster), AND send large files with a low memory footprint?
> Thanks,
> Naaman
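On point 1 of the quoted question: once the consumers are synchronous
receive() loops, a small fixed pool of threads, each owning its own session
and consumer, can cover the container role that Jencks played for async
delivery (whether Jencks can still be kept for connection pooling is left
open here). A rough sketch, reusing the illustrative queue name from the
earlier example:

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

import javax.jms.Connection;
import javax.jms.Message;
import javax.jms.MessageConsumer;
import javax.jms.Session;

import org.apache.activemq.ActiveMQConnectionFactory;

public class SyncConsumerPool {
    public static void main(String[] args) throws Exception {
        ActiveMQConnectionFactory factory =
                new ActiveMQConnectionFactory("tcp://localhost:61616");
        final Connection connection = factory.createConnection();
        connection.start();

        // One thread per synchronous consumer. Each task creates its own
        // Session and MessageConsumer, since sessions are not thread-safe.
        ExecutorService pool = Executors.newFixedThreadPool(4);
        for (int i = 0; i < 4; i++) {
            pool.execute(new Runnable() {
                public void run() {
                    try {
                        Session session = connection.createSession(
                                true, Session.SESSION_TRANSACTED);
                        MessageConsumer consumer = session.createConsumer(
                                session.createQueue("GROUP.QUEUE"));
                        while (true) {
                            Message first = consumer.receive();
                            // ... receive the rest of the group here, as in
                            // the earlier sketch ...
                            session.commit();   // one commit per completed group
                        }
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            });
        }
    }
}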



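On point 2: the stream API in question is ActiveMQ's JMS Streams support,
which exposes a destination as a plain java.io stream. As the reply above
says, those streams cannot be enlisted in a transaction, so the file bytes
flow outside any transacted session. A minimal, non-transacted sketch,
assuming ActiveMQConnection.createOutputStream/createInputStream as the entry
points (treat the exact method names as an assumption) and an illustrative
queue name:

import java.io.InputStream;
import java.io.OutputStream;

import javax.jms.Session;

import org.apache.activemq.ActiveMQConnection;
import org.apache.activemq.ActiveMQConnectionFactory;

public class StreamExample {
    public static void main(String[] args) throws Exception {
        ActiveMQConnectionFactory factory =
                new ActiveMQConnectionFactory("tcp://localhost:61616");
        ActiveMQConnection connection =
                (ActiveMQConnection) factory.createConnection();
        connection.start();

        Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);

        // Producer side: write the file into the broker as a stream of messages.
        // None of this is part of a JMS transaction.
        OutputStream out =
                connection.createOutputStream(session.createQueue("FILES.QUEUE"));
        out.write(new byte[] { 1, 2, 3 });   // in practice, copy from a FileInputStream
        out.close();                         // close() flushes the final chunk

        // Consumer side: read the same destination back as an InputStream.
        InputStream in =
                connection.createInputStream(session.createQueue("FILES.QUEUE"));
        byte[] buffer = new byte[4096];
        int read;
        while ((read = in.read(buffer)) != -1) {
            // process 'read' bytes from buffer; -1 is expected once the
            // producer side has closed the stream
        }
        in.close();

        connection.close();
    }
}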