activemq-dev mailing list archives

From "Dmytro Karimov (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (AMQ-5235) erroneous temp percent used
Date Mon, 02 Feb 2015 15:44:34 GMT

    [ https://issues.apache.org/jira/browse/AMQ-5235?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14301384#comment-14301384 ]

Dmytro Karimov commented on AMQ-5235:
-------------------------------------

Hi, we have also faced this problem.

Maybe this can help: ActiveMQ writes the following log entry:
{code}
2015-02-02 14:56:12,829 | ERROR | Temporary Store limit is 102400 mb, whilst the temporary
data directory: /usr/local/activemq/data/amq/tmp_storage only has 6037 mb of usable space
- resetting to maximum available 6037 mb. | org.apache.activemq.broker.BrokerService | main
{code}
but the directory _/usr/local/activemq/data/amq/tmp_storage_ does not exist.

We also use LevelDB, and the _activemq-data_ folder size is 754184.

> erroneous temp percent used
> ---------------------------
>
>                 Key: AMQ-5235
>                 URL: https://issues.apache.org/jira/browse/AMQ-5235
>             Project: ActiveMQ
>          Issue Type: Bug
>          Components: activemq-leveldb-store
>    Affects Versions: 5.9.0
>         Environment: debian (quality testing and production)
>            Reporter: anselme dewavrin
>
> Dear all,
> We have an ActiveMQ 5.9 broker configured with 1 GB of tempUsage allowed, just as a precaution,
since we only use persistent messages (about 6000 messages per day). After several days of use,
the temp usage increases, and even shows values that are above the total amount of data on
disk. Here it shows 45% of its 1 GB limit for the following files:
> find activemq-data -ls
> 76809801    4 drwxr-xr-x   5 anselme  anselme      4096 Jun 19 10:24 activemq-data
> 76809813    4 -rw-r--r--   1 anselme  anselme        24 Jun 16 16:13 activemq-data/store-version.txt
> 76809817    4 drwxr-xr-x   2 anselme  anselme      4096 Jun 16 16:13 activemq-data/dirty.index
> 76809811    4 -rw-r--r--   2 anselme  anselme      2437 Jun 16 12:06 activemq-data/dirty.index/000008.sst
> 76809820    4 -rw-r--r--   1 anselme  anselme        16 Jun 16 16:13 activemq-data/dirty.index/CURRENT
> 76809819   80 -rw-r--r--   1 anselme  anselme     80313 Jun 16 16:13 activemq-data/dirty.index/000011.sst
> 76809822    0 -rw-r--r--   1 anselme  anselme         0 Jun 16 16:13 activemq-data/dirty.index/LOCK
> 76809810  300 -rw-r--r--   2 anselme  anselme    305206 Jun 16 11:51 activemq-data/dirty.index/000005.sst
> 76809821 2048 -rw-r--r--   1 anselme  anselme   2097152 Jun 19 11:30 activemq-data/dirty.index/000012.log
> 76809818 1024 -rw-r--r--   1 anselme  anselme   1048576 Jun 16 16:13 activemq-data/dirty.index/MANIFEST-000010
> 76809816    0 -rw-r--r--   1 anselme  anselme         0 Jun 16 16:13 activemq-data/lock
> 76809815 102400 -rw-r--r--   1 anselme  anselme  104857600 Jun 19 11:30 activemq-data/0000000000f0faaf.log
> 76809823 102400 -rw-r--r--   1 anselme  anselme  104857600 Jun 16 11:50 activemq-data/0000000000385f46.log
> 76809807    4 drwxr-xr-x   2 anselme  anselme      4096 Jun 16 16:13 activemq-data/0000000000f0faaf.index
> 76809808  420 -rw-r--r--   1 anselme  anselme    429264 Jun 16 16:13 activemq-data/0000000000f0faaf.index/000009.log
> 76809811    4 -rw-r--r--   2 anselme  anselme      2437 Jun 16 12:06 activemq-data/0000000000f0faaf.index/000008.sst
> 76809812    4 -rw-r--r--   1 anselme  anselme       165 Jun 16 16:13 activemq-data/0000000000f0faaf.index/MANIFEST-000007
> 76809809    4 -rw-r--r--   1 anselme  anselme        16 Jun 16 16:13 activemq-data/0000000000f0faaf.index/CURRENT
> 76809810  300 -rw-r--r--   2 anselme  anselme    305206 Jun 16 11:51 activemq-data/0000000000f0faaf.index/000005.sst
> 76809814 102400 -rw-r--r--   1 anselme  anselme  104857600 Jun 12 21:06 activemq-data/0000000000000000.log
> 76809802    4 drwxr-xr-x   2 anselme  anselme      4096 Jun 16 16:13 activemq-data/plist.index
> 76809803    4 -rw-r--r--   1 anselme  anselme        16 Jun 16 16:13 activemq-data/plist.index/CURRENT
> 76809806    0 -rw-r--r--   1 anselme  anselme         0 Jun 16 16:13 activemq-data/plist.index/LOCK
> 76809805 1024 -rw-r--r--   1 anselme  anselme   1048576 Jun 16 16:13 activemq-data/plist.index/000003.log
> 76809804 1024 -rw-r--r--   1 anselme  anselme   1048576 Jun 16 16:13 activemq-data/plist.index/MANIFEST-000002
> The problem is that in our production system it once blocked producers with a tempUsage
of 122%, even though the disk was almost empty.
> So we investigated and ran the broker in a debugger, and found how the usage is calculated.
It is in the Scala LevelDB files: the figure is not based on what is actually on disk, but on
what the store thinks is on disk. It multiplies the size of one log file by the number of log
files known to a certain hashmap.
> I think the entries of the hashmap are not removed when the log files are purged.
> Could you confirm?
> Thanks in advance 
> Anselme
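A minimal Scala sketch of the accounting pattern described in the quoted report, under the assumption that usage is derived from an in-memory map of known log files whose entries are not removed when logs are purged. All names below are hypothetical illustrations, not the actual classes in activemq-leveldb-store:
{code}
import scala.collection.mutable

object TempUsageSketch {

  // Each journal log file is 100 MB (matches the 104857600-byte .log files listed above).
  val logSize: Long = 100L * 1024 * 1024

  // Hypothetical map of known log files: log position -> file name.
  val knownLogs = mutable.HashMap[Long, String]()

  // Usage is estimated as (number of known logs) * (log size), not measured from the disk.
  def estimatedTempUsage: Long = knownLogs.size * logSize

  def main(args: Array[String]): Unit = {
    // Three logs get written over time...
    Seq(0L, 0x385f46L, 0xf0faafL).foreach(pos => knownLogs(pos) = f"$pos%016x.log")

    // ...then the oldest log is purged from disk, but its map entry is kept
    // (the suspected bug). A fix would also do:
    //   knownLogs.remove(0L)

    // Still reports 300 MB even though only 200 MB remain on disk.
    println(s"estimated temp usage: ${estimatedTempUsage / (1024 * 1024)} MB")
  }
}
{code}
If the suspicion is correct, removing a purged log's entry from the map (or recomputing the figure from the directory contents) would keep the reported percentage in line with the actual disk usage.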



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
