jmeter-user mailing list archives

From "Marc Chiarini (Tufts)" <marc.chiar...@tufts.edu>
Subject Gaussian Random Timer behavior
Date Sun, 09 Aug 2009 20:07:32 GMT
Hi Folks,

I am trying to verify that I am looking at my "problem" from the correct
perspective. I have a single thread group and an HTTP sampler that
requests the same file repeatedly for a user-defined duration (say
300 seconds). The sampler has a Gaussian Random Timer as a child, with
the offset and std dev set manually (e.g., 500 ms offset and 50 ms
deviation) in order to rule out any weirdness from using variables.

When I run the test with one thread, I get the behavior I want: the
delay between requests is normally distributed, with mean and std dev
very close to the timer settings. However, when I use two or more
threads, the delay distributions no longer fit any particular shape,
and they are certainly not normal. As I raise the number of threads
(and correspondingly increase the offset and std dev, just to make sure
the center of mass stays well away from zero), the distribution starts
to look exponential. Am I missing something? If it is true that the sum
of two or more normally distributed random variables is itself normally
distributed, then I don't understand why using multiple threads in this
fashion produces this behavior.
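
For reference, my mental model of what each thread does on every
iteration is roughly the following (a sketch only, not the actual
GaussianRandomTimer source; details such as how negative draws are
handled may differ):

    import java.util.Random;

    // Sketch: each thread independently draws a delay ~ N(offset, stddev)
    // and sleeps for it before firing its next request.
    public class GaussianDelaySketch {
        private static final Random RANDOM = new Random();

        static long delayMillis(double offsetMs, double stdDevMs) {
            long d = (long) (offsetMs + stdDevMs * RANDOM.nextGaussian());
            return Math.max(d, 0); // never sleep a negative amount
        }

        public static void main(String[] args) throws InterruptedException {
            for (int i = 0; i < 5; i++) {
                long d = delayMillis(500, 50); // 500 ms offset, 50 ms dev, as in my plan
                System.out.println("sleeping " + d + " ms");
                Thread.sleep(d);
            }
        }
    }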

Any help is greatly appreciated.

Regards,
Marc

PS: I reproduced the generation of normally distributed delay times
with a BeanShell Timer instead, but obtained similar results.
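
For anyone who wants to try it, a BeanShell Timer script along these
lines generates the same kind of delays (the timer uses the script's
return value as the delay in milliseconds; the 500/50 values are just
my test settings):

    // BeanShell Timer: the value this script returns is used as the
    // delay, in milliseconds, before the sampler fires.
    import java.util.Random;

    Random r = new Random();
    long delay = (long)(500 + 50 * r.nextGaussian()); // mean 500 ms, std dev 50 ms
    if (delay < 0) { delay = 0; }                     // clamp negative draws
    return delay;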


