jmeter-user mailing list archives

From Deepak Shetty <>
Subject Re: Odd problem with performance of JMeter
Date Wed, 22 Jun 2011 20:02:17 GMT
you can load the resulting JTL file into any of the standard listeners to see
the same output that you see in the GUI.
You can specify whatever you need to be saved in the
JTL file.
The Constant Throughput Timer does not behave intuitively - I believe the
Throughput Shaping Timer from jmeter-plugins is better.
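What gets written to the JTL is controlled by JMeter's save-service properties
(set in user.properties or jmeter.properties). A minimal sketch - property names
taken from recent JMeter versions, check your own jmeter.properties for the full
list:

```properties
# Write results as CSV instead of XML
jmeter.save.saveservice.output_format=csv
# Don't write assertion results into the results file
jmeter.save.saveservice.assertion_results=none
# Keep the fields needed to rebuild an Aggregate Report offline
jmeter.save.saveservice.timestamp=true
jmeter.save.saveservice.time=true
jmeter.save.saveservice.label=true
jmeter.save.saveservice.successful=true
```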


On Wed, Jun 22, 2011 at 12:58 PM, John Lussmyer <> wrote:

> I hadn't noticed the Constant Throughput Timer; that does even things out a
> bit.
> This isn't formal perf testing, just making sure that our latest changes
> haven't completely hosed up performance.
> What I really need after running this for X amount of time is what is
> displayed by an Aggregate Report - which is a listener - which it's
> suggested I not use.  Makes it more difficult to get the values!
> Also, when I run from the command line with the -l switch, the (one and
> only) Response Assertion is spewing XML to the file, and I haven't found a
> way to turn that off.
> I can get the Aggregate report to output CSV, but that data doesn't seem to
> be what is displayed by the report.  I'd LIKE to see the values displayed in
> the output file.
> Using the Constant Throughput timer, I haven't been able to drive the App
> server CPU usage up much at all.
> Using the old Constant Delay timer I was able to do so.
> Constant Throughput has been a bit odd, both the JMeter machine and App
> server machines run at nearly the same CPU usage, and track each other.
>  There will be short (minute or two) bursts where they both go up to about
> 17% usage, but then they both drop back down to around 9% for a while.  In
> these cases, the response times do seem to vary somewhat, but since I
> haven't been able to log them properly yet, I can't be sure of the
> correlation between response times and cpu usage.
> (and yes, I know CPU is only one of the factors.  In this case, it's the
> one we know will be a problem.)
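One way around the "don't use listeners" problem described above is to run
non-GUI with -l, save the JTL as CSV, and compute the Aggregate Report numbers
afterwards. A rough sketch (not JMeter's own code) in Python, assuming the
default CSV columns timeStamp, elapsed, label and success:

```python
import csv
import statistics
from io import StringIO

def aggregate(jtl_csv_text):
    """Compute Aggregate-Report-style stats from a CSV JTL."""
    rows = list(csv.DictReader(StringIO(jtl_csv_text)))
    elapsed = sorted(int(r["elapsed"]) for r in rows)
    errors = sum(1 for r in rows if r["success"] != "true")
    stamps = [int(r["timeStamp"]) for r in rows]
    # wall-clock span of the test in seconds (guard against a single sample)
    span_s = (max(stamps) - min(stamps)) / 1000.0 or 1.0
    return {
        "samples": len(rows),
        "average": statistics.mean(elapsed),
        "median": statistics.median(elapsed),
        "p90": elapsed[int(0.9 * (len(elapsed) - 1))],   # 90% line
        "error_pct": 100.0 * errors / len(rows),
        "throughput": len(rows) / span_s,                # requests/second
    }

# Hypothetical five-sample JTL for illustration
sample = """timeStamp,elapsed,label,success
1000,100,login,true
2000,200,login,true
3000,300,login,false
4000,400,login,true
5000,500,login,true
"""
print(aggregate(sample))
```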
> -----Original Message-----
> From: Oliver Lloyd []
> Sent: Wednesday, June 22, 2011 11:58 AM
> To:
> Subject: Re: Odd problem with performance of JMeter
> Hi John, when running tests like this you need to think in terms of
> throughput rather than total transactions achieved. You mention you are
> aiming for 800,000 requests, but really you want to define a throughput
> rate and have that as your target. The volume of requests is then simply a
> matter of time: the longer you run the test, the greater the volume.
> So, you should set up your test plan to achieve a rate of requests per
> second. You can do this using the Constant Throughput Timer. Try spreading
> the requests over more threads; this will increase concurrency. The actual
> concurrency is also something you may want to think about - what is the
> expected concurrency that you need to prove is possible?
> Keep in mind when you are doing this that the response time for each
> request will be a factor in how many requests each thread can deliver: if
> it takes 1 second to process each request then the maximum throughput per
> thread is one request per second, or 60 per minute. It's basic math.
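The arithmetic above also tells you how many threads you need to hold a target
rate. A small sketch (min_threads is an illustrative helper, not a JMeter
function):

```python
import math

def min_threads(target_rps, response_time_s):
    """Threads needed to sustain target_rps when each request takes
    response_time_s: one thread delivers at most 1/response_time_s
    requests per second, so you need target_rps * response_time_s
    threads, rounded up."""
    return math.ceil(target_rps * response_time_s)

# With 1 s responses a thread can do at most 1 req/s,
# so holding 50 req/s needs at least 50 threads.
print(min_threads(50, 1.0))
```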
> Now, once this is in place you can begin to run some tests. Try starting
> at a lower value and evaluating your system at this steady rate - do you
> see the same drop off? The key here is to hold the request rate at the
> desired level and not let the test run as fast as it can. If you let
> things run wild without any control it makes debugging issues much harder.
> If you are able to maintain a steady rate without degradation at, say, 50
> requests per second, then you could try a higher rate, or you could even
> try a gradual ramp up. Keep this slow; there's no point hammering your
> system hard right from the start as this will produce unrealistic data.
> When running tests you should keep an eye not just on CPU usage but also
> on lots of other things. Memory usage on the JMeter box is something that
> can cause issues, but there is also a wide range of metrics from your
> system that can cause the behavior you mention.
> Make sure you use an Assertion to verify that the response you are getting
> back is in fact what you expect it to be. You might find that you are
> getting errors but JMeter is displaying success because it sees a 200
> response - this is a very common error.
> Finally, as Deepak mentions, using listeners at high request rates can
> create a bottleneck in the test rig, and typically it is better to run the
> actual load tests from the command line. But you first need to prove that
> this is your issue, and you can do this by running a few simple
> experimental tests at differing rates. If you can show an issue on the
> test rig then the best option might be to distribute the load, but before
> doing this you need to establish that a steady rate - over time - is
> possible from one machine. If this is not possible even at lower rates
> then you've very likely got an application issue (although it is odd that
> you mention the response times remain constant).
> Think of it like a scientific experiment, put a white coat on and use the
> word 'logical' a lot.
> --
> ---------------------------------------------------------------------
> To unsubscribe, e-mail:
> For additional commands, e-mail:
