jmeter-user mailing list archives

From ansonyc <>
Subject The more remote servers I use, the less throughput I get
Date Fri, 11 Apr 2008 21:51:13 GMT


I'm starting to use remote servers, but I'm having trouble understanding their
throughput behaviour and how JMeter distributes the Test Plan's threads.

I have a Test Plan that achieves about 10 requests/second when run locally
against a target system using 20 threads.  I take this same Test Plan with the
same number of threads and run it on a single remote server, controlled from
the original machine.  I get about 5 requests/second, roughly half the total
number of Samples in the same period of time.  The response times are similar
and the Uniform Random Timer settings are identical, so it's almost as if the
remote server is running 10 threads rather than 20.

I then add a second remote server and run the same Test Plan remotely over
those two.  Strangely, I get about 2.5 requests/second in total!
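
For reference, the three runs correspond to invocations like the following
(host names and file names are just examples, not my actual setup; the remote
machines are each running jmeter-server):

```shell
# Local run on the controlling machine:
jmeter -n -t plan.jmx -l local.jtl

# Same plan, driven from the controller against one remote server:
jmeter -n -t plan.jmx -R server1 -l remote1.jtl

# Same plan against two remote servers:
jmeter -n -t plan.jmx -R server1,server2 -l remote2.jtl
```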

I'm observing requests per second through a Summary Report and an Aggregate
Report.  I also happen to have a Graph Results listener running.

Here's the kicker: just to make sure I'm not running into capacity problems
of the specific remote machine, I copy my .jmx script directly over to one
of the remote servers and run it LOCALLY from there.  I get the full 10
requests per second I'm expecting.
