jmeter-user mailing list archives

From Jörg Godau <>
Subject understanding aggregate report throughput values
Date Tue, 27 Jul 2010 14:17:59 GMT
Hi All,

we have a fairly simple test that logs in to our application.

We've set up a Gaussian Random Timer and are monitoring the results in an Aggregate Report.

My question is about the throughput: if we reduce the delays in the Timer, the time taken
to log in to the application increases (which makes sense, as there is more load on the server).

Why is the throughput also increasing? If each request takes much longer (roughly ten times
as long) under the increased load, shouldn't the throughput be lower?

Some numbers to illustrate:
Timer ave 30 sec / deviation 15 sec  => Average request  700ms, max   2760ms, throughput
Timer ave 10 sec / deviation  3 sec  => Average request 9610ms, max 114300ms, throughput
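(A sanity check on the numbers above, as a hedged sketch rather than JMeter's actual code: the Aggregate Report's throughput is requests divided by elapsed time, so each thread's rate is bounded by timer delay plus response time. With the timer dominating the cycle, a shorter timer can raise throughput even though responses slow down.)

```python
# Hypothetical sketch (NOT JMeter's implementation): one thread completes
# one login per (timer delay + response time), so its throughput is
# roughly 1 / cycle_time.

def per_thread_throughput(timer_delay_s: float, response_s: float) -> float:
    """Approximate requests per second for a single thread."""
    return 1.0 / (timer_delay_s + response_s)

slow_timer = per_thread_throughput(30.0, 0.70)  # 30.7 s cycle
fast_timer = per_thread_throughput(10.0, 9.61)  # 19.61 s cycle

# The 10 s timer scenario still yields a shorter cycle (~19.6 s vs ~30.7 s),
# so its throughput is higher despite responses being ~14x slower.
print(fast_timer > slow_timer)
```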

Can someone please explain how this is possible?

Kind regards
Jörg Godau

SCHÜTZE Consulting Informationssysteme GmbH
Argentinische Allee 22b
14163 Berlin
Tel.: 030/ 802 49 44
Fax: 030/ 8090 39 95

Managing Director: Klaus-Dieter Schütze
Register court: Amtsgericht Charlottenburg
Registration number: HRB 73618
VAT identification number per § 27a Umsatzsteuergesetz: DE 813181239
