jmeter-user mailing list archives

From Quanah Gibson-Mount <>
Subject stress testing with ramp-up and time periods (LDAP/SLAMD)
Date Wed, 10 Oct 2018 23:31:47 GMT

In the past I've used slamd to stress test various LDAP servers to 
determine the maximum throughput they could handle for various 
operations or sets of operations.  However, after reading the jmeter docs, I'm 
not clear how one would replicate this type of testing.

Generally, in SLAMD, I could configure things to behave in the following 
manner:

1) Set up your distributed clients (as can be done with jmeter)

2) Configure the overall system so that each client starts out with 
1 thread executing your task, with a warmup period of W, a measurement 
duration of X, and a cooldown of Y seconds afterward. Then 
increment the thread count on each client and repeat.  You continue 
this until there has been no performance increase for at least Z iterations. 
Once that maximum is determined, re-run the best iteration for a configured 
amount of time.
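For clarity, the control loop I'm describing could be sketched roughly like this in Python. The `run_iteration()` callable is hypothetical and stands in for one warmup + timed measurement + cooldown pass across the distributed clients; only the improvement-counter logic is shown:

```python
def find_max_throughput(run_iteration, z=3):
    """Increase the per-client thread count until throughput stops
    improving for z consecutive iterations, then return the best.

    run_iteration(threads) is a hypothetical callable that performs one
    warmup + measurement + cooldown pass with the given number of
    threads per client and returns the observed throughput (ops/sec).
    """
    best_threads, best_throughput = 0, 0.0
    no_improvement = 0
    threads = 1
    while no_improvement < z:
        throughput = run_iteration(threads)
        if throughput > best_throughput:
            # New maximum found; reset the improvement counter.
            best_threads, best_throughput = threads, throughput
            no_improvement = 0
        else:
            no_improvement += 1
        threads += 1
    return best_threads, best_throughput
```

Once that loop terminates, the best thread count would be re-run for the longer, configured confirmation period.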

For example, here's the generated report for a stress test I ran several 
years ago:


In this case, I had 16 distributed clients.  They would run each iteration 
for 1200 seconds, with a 30 second delay between iterations.  They would 
start with 1 thread each, and no specified maximum (since this is 
controlled by the improvement counter).  There was a warmup of 60 seconds 
before statistics were gathered and the iteration timer started, and 
a cooldown of 60 seconds at the end of the iteration.  Statistics were 
collected every 10 seconds.

Does anyone have pointers and/or documentation that would allow me to set 
up a similar sort of stress test with jmeter, since slamd was abandoned several 
years ago?



Quanah Gibson-Mount
Product Architect
Symas Corporation
Packaged, certified, and supported LDAP solutions powered by OpenLDAP:

