jmeter-user mailing list archives

From Adrian Speteanu <asp.ad...@gmail.com>
Subject throughput not constant in test
Date Fri, 07 Dec 2012 14:45:38 GMT
Hi guys,

I have encountered a very weird scenario and I just can't make any sense
out of it: the throughput of similar requests in the same thread group is
different. Please suggest a course of action.

Problem: No matter how I configure the test below, requests "sampler 2-x"
have a lower throughput than samplers 1-x, although the response times are
very similar.

Test config, with all values used in the most relevant repeats of this test:
|_thread group
        [ 200-2000-4000 threads; ramp 20-200-400; loop forever ]
   + [ loop controller = 10 ]
         |_ sampler 1-1 (httpClient4, POST, follow redirects; Keep-Alive is
unticked)
         |_ sampler 1-2 (httpClient4, GET, redirect automatically,
Keep-Alive is unticked)
         |_ Header Manager [ Accept: application/json ]
   + [loop controller = 10]
         |_ sampler 2-1 (httpClient4, POST, follow redirects; Keep-Alive is
unticked)
         |_ sampler 2-2 (httpClient4, GET, redirect automatically,
Keep-Alive is unticked)
         |_ Header Manager [ Accept: application/json ]
|_ Constant Throughput Timer [ disabled / enabled; target = 3000-30,000 per
minute; sharing = tried all BUT "this thread only" ]
|_ Aggregate Graph
    ( also used other listeners, but disabled them after noticing the
problem so they have no influence...)
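As a sanity check on the target values above, here is a back-of-envelope sketch of the per-thread delay a shared throughput target implies. This is my own simplified model, not JMeter's actual Constant Throughput Timer algorithm; the numbers are just the ones from the config above.

```python
# Simplified model (NOT JMeter's exact implementation): if `target_per_min`
# samples/minute are shared evenly across `active_threads` threads, each
# thread must pause this many milliseconds between samples.

def per_thread_delay_ms(target_per_min: float, active_threads: int) -> float:
    per_thread_rate = target_per_min / active_threads  # samples/min per thread
    return 60_000.0 / per_thread_rate                  # ms between samples

# 3000/min shared across 200 threads -> one sample every 4 s per thread
print(per_thread_delay_ms(3000, 200))    # 4000.0
# 30,000/min across 2000 threads -> also 4 s per thread
print(per_thread_delay_ms(30000, 2000))  # 4000.0
```

With ~260 ms response times, a 4 s pacing per thread should leave plenty of headroom, which is why the CPU/heap numbers below look so idle.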

Note: Apart from the structure above, I also make appropriate assertions on
the response code and on the response headers (if I asked for JSON, I
validate that Content-Type: application/json is present). And that's it.

Note 2: Practically, the only difference between the samplers is that 1-x
get JSON responses while 2-x get XML responses...

---------------------

Results of tests that don't use the Constant Throughput Timer: for all
samplers, median = 258-262 ms; 90% line = 298-302 ms; error rate = 0%.
However, the throughput of the samplers that return XML is smaller by a 10%
margin in all repetitions of the test. Even when the test runs for a long
time, there is still a significant difference between them, even though the
response times are almost identical.

Results of tests that use the Constant Throughput Timer: for all samplers,
median = 258-262 ms; 90% line = 298-302 ms; error rate = 0% (almost
identical to the tests above). However, the throughput of the samplers that
return XML is far lower than that of the others (they represent only 1-10%
of the total samples made).

In all repetitions of the tests, the [jmeter] test machine uses 4-6% CPU
and the heap size is 500-800M (depending only on the length of the test).

I've also tried putting the two sets of requests in different thread
groups, and adding the CTT as a child of the controllers or directly of the
samplers. The results were similar.

--------------------

The first result might make sense: the gap might be caused by the fact that
requests 1-x start earlier and their group is larger. I didn't do all the
math and am not sure about it; I only ran that variant because I noticed
the issue while using the Constant Throughput Timer and wanted to see if
the timer was the cause of the problem.
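To illustrate the "didn't do all the math" part, here is a toy sketch of the effect I suspect: each thread runs the first loop controller's samplers, then the second's, so a thread stopped mid-cycle has logged up to a full loop's worth of extra group-1 samples. The numbers are purely illustrative, not from the actual run.

```python
# Toy model: a thread alternates `loop` sequential samples for group 1,
# then `loop` for group 2. If the thread is stopped after `total` samples,
# count how many were attributed to each group.

def samples_per_group(total: int, loop: int = 10) -> tuple:
    g1 = g2 = 0
    i = 0
    while i < total:
        take = min(loop, total - i)  # group 1 runs first in each cycle
        g1 += take
        i += take
        if i >= total:
            break
        take = min(loop, total - i)  # then group 2
        g2 += take
        i += take
    return g1, g2

# A thread cut off after 15 samples: 10 counted for group 1, only 5 for group 2.
print(samples_per_group(15))  # (10, 5)
# A thread that completes full cycles shows no bias.
print(samples_per_group(40))  # (20, 20)
```

This kind of cut-off bias could plausibly explain a modest gap over a finite run, but not the 1-10% collapse seen with the timer enabled.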

The second group of results is most surely very wrong. I can't explain why
such similar requests get treated differently. The application seems
relatively stable, especially at lower loads, and the problem is still
there, so I assume it's an issue with this use case (and the application
does not appear to be visibly more loaded by XML responses than by JSON
ones, which is what we expected before running the test).

-------------------

Any feedback is welcome at this point.

--Adrian S
