activemq-users mailing list archives

From "" <>
Subject Duplicate message detection on failover
Date Wed, 07 Mar 2018 15:43:02 GMT
We have ActiveMQ 5.12 in the following configuration:

PostgreSQL for persistence
two Nodes, one in Standby
JDBC Master Slave with shared Database
static cluster discovery
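
For reference, a JDBC master/slave setup of this kind is typically wired up along these lines (host names, database name, and credentials below are placeholders, not our actual values):

```xml
<!-- activemq.xml: both nodes point at the same shared PostgreSQL database. -->
<!-- The node that acquires the database lock becomes master; the other one -->
<!-- blocks on the lock and stays in standby until failover. -->
<bean id="postgres-ds" class="org.postgresql.ds.PGPoolingDataSource">
  <property name="url" value="jdbc:postgresql://dbhost:5432/activemq"/>
  <property name="user" value="activemq"/>
  <property name="password" value="secret"/>
</bean>

<persistenceAdapter>
  <jdbcPersistenceAdapter dataSource="#postgres-ds"/>
</persistenceAdapter>
```

Clients reach the cluster through a static failover URL listing both nodes, e.g. `failover:(tcp://node1:61616,tcp://node2:61616)`, and reconnect to whichever node holds the lock.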
Everything seems to be fine and failover works as expected, but sometimes during a failover (or a restart of the whole cluster) we observe the following exception:

 WARN  [ActiveMQ NIO Worker 6] org.apache.activemq.transaction.LocalTransaction  - Store COMMIT Batch entry 2 INSERT INTO ACTIVEMQ_MSGS(ID, MSGID_PROD, MSGID_SEQ,
CONTAINER, EXPIRATION, PRIORITY, MSG, XID) VALUES (...) was aborted:  Unique-Constraint activemq_msgs_pkey
Detail: key(id)=(7095330) already exists
ActiveMQ propagates this exception directly to the client.

I thought that ActiveMQ would be able to recognise a duplicated message, but something goes
wrong here.

The client tries to deliver a message with an already existing ID. Should ActiveMQ not compare
this message to the one already in storage (where the database makes that possible) and, if both
messages are the same, simply ignore the second one?
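
The check we have in mind could look roughly like this (a sketch only, based on our table columns; this is not the statement ActiveMQ actually executes, and the ID is the one from the log above):

```sql
-- Hypothetical duplicate check: before failing on the primary key, look up
-- the stored row with the same broker ID and compare the producer identity.
SELECT msgid_prod, msgid_seq
FROM activemq_msgs
WHERE id = 7095330;

-- If (msgid_prod, msgid_seq) of the incoming message matches the stored pair,
-- it is the same logical message redelivered during failover, and the second
-- INSERT could be skipped instead of propagating the error to the client.
```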

Or does ActiveMQ assume that duplicated messages may be persisted, so that our DB
structure (the constraint on id) is not correct?

CREATE TABLE activemq_msgs (
   id          bigint          NOT NULL,
   container   varchar(250),
   msgid_prod  varchar(250),
   msgid_seq   bigint,
   expiration  bigint,
   msg         bytea,
   priority    bigint,
   xid         varchar(250)
);

ALTER TABLE activemq_msgs
   ADD CONSTRAINT activemq_msgs_pkey
   PRIMARY KEY (id);
Should we drop activemq_msgs_pkey?
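
Before dropping the constraint, it might be worth checking whether the colliding rows really are the same logical message stored twice, or two distinct messages that received the same broker ID after the restart. A diagnostic sketch against the schema above:

```sql
-- Hypothetical diagnostic: producer message identities that appear more than
-- once would indicate the same logical message persisted under different
-- broker-assigned IDs (a dedup failure rather than an ID-sequence clash).
SELECT msgid_prod, msgid_seq, COUNT(*) AS copies
FROM activemq_msgs
GROUP BY msgid_prod, msgid_seq
HAVING COUNT(*) > 1;
```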