jmeter-user mailing list archives

From Marc Esher <marc.es...@gmail.com>
Subject Re: Load testing, Continuous Integration, failing on build-over-build degradation
Date Tue, 16 Jul 2013 22:35:27 GMT
Good thoughts, Adrian. Thanks for that advice.


On Tue, Jul 16, 2013 at 12:24 PM, Adrian Speteanu <asp.adieu@gmail.com> wrote:

> I understand now; I had never thought of it. I just look at the older graphs
> and compare (the plugin plots all available results, and I keep some of the
> old ones around for a while), but I don't fail the build automatically; the
> runs just sit side by side.
>
> If I wanted to do that, there are various hacks that could be used.
> Duration Assertions, for one, but if you are looking for thresholds over
> the entire data set, then maybe BeanShell. Either way, it should be the only
> sampler that fails if you want to be notified. I was also thinking of a
> Transaction Controller, but there are too few use cases where that is
> useful.
>
> There is no pretty way to do this; a Duration Assertion should be OK, but it
> might be tricky. Let's explore it: you set a threshold quite close to the
> values you observe when creating the regression test script. If the
> difference between that threshold and the expected/current response time is
> smaller than the standard deviation, then the Duration Assertion will fail
> requests that are within the normal distribution. That's bad. You also have
> to tweak the expected error percentage threshold, which is the feature we
> thought you were looking for. Doable, but not pretty at all.
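>
> A minimal sketch of the Duration Assertion route (untested; the property name
> max_ms is made up for illustration): the fragment below is what ends up in the
> .jmx file, and ${__P(max_ms,2000)} reads the threshold from a JMeter property,
> so a Jenkins job could tighten or relax it with -Jmax_ms=... on the command
> line.
>
>     <!-- Fails any sample that takes longer than the max_ms property
>          (defaulting to 2000 ms) to complete. -->
>     <DurationAssertion guiclass="DurationAssertionGui" testclass="DurationAssertion"
>                        testname="Max response time" enabled="true">
>       <stringProp name="DurationAssertion.duration">${__P(max_ms,2000)}</stringProp>
>     </DurationAssertion>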
>
> Cheers,
> Adrian S
>
>
>
> On Tue, Jul 16, 2013 at 5:10 PM, Marc Esher <marc.esher@gmail.com> wrote:
>
> > So to be clear: that's simply detecting errors that rise above a certain
> > threshold. But currently, there's no way to track performance degradation
> > over time, correct?
> >
> > What I want is an automatic way to spot degradation job-over-job, such
> > that Jenkins would realize "Your tests are now 10% slower than they were
> > a week ago".
> >
> > Or is that asking for too much, and perhaps for trouble?
> >
> >
> > On Tue, Jul 16, 2013 at 10:05 AM, Cedrick Johnson <
> > cjohnson@backstopsolutions.com> wrote:
> >
> > > This is contained in Jenkins; I don't know about Maven or setting that
> > > up. We are using the standard JMeter plugin within Jenkins. If you
> > > activate it, that should work. Here's my Build step in Jenkins (Execute
> > > Shell, and yeah, we're still on 2.8):
> > >
> > > rm -f *.jtl
> > > $HOME/apache-jmeter-2.8/bin/jmeter -n -t SomeTestPlan.jmx \
> > >     -JServerName=wee.com -JServerPort=8080 \
> > >     -JUserThreads=50 -JUserLoopCount=1 \
> > >     -l RhubarbTestResults.jtl
> > >
> > > That works for us, and has caught some pretty big design changes that
> > > slowed things down.
> > >
> > > -c
> > >
> > > -----Original Message-----
> > > From: Shmuel Krakower [mailto:shmulikk@gmail.com]
> > > Sent: Tuesday, July 16, 2013 1:29 AM
> > > To: JMeter Users List
> > > Subject: RE: Load testing, Continuous Integration, failing on
> > > build-over-build degradation
> > >
> > > Hi Cedrick,
> > > Thanks for sharing, but is this post-build action part of the Maven
> > > plugin or part of Jenkins?
> > >
> > > I have been looking for exactly this capability for a couple of months
> > > now! Can you point me to a link with a brief introduction to it, as I
> > > couldn't find any.
> > >  On Jul 15, 2013 10:01 PM, "Cedrick Johnson" <
> > > cjohnson@backstopsolutions.com>
> > > wrote:
> > >
> > > > When you configure your JMeter Jenkins job, in Post-Build Actions you
> > > > can have it publish the performance test result report, which points to
> > > > the test results .jtl file that is generated when running the test. In
> > > > that report there's a Performance Threshold section where you can set
> > > > it to mark the build unstable (when the number of errors exceeds one
> > > > percentage amount) or failed (when the number of errors exceeds another
> > > > set amount).
> > > >
> > > > The errors are determined by your actual load test, i.e. if requests
> > > > start timing out, or other conditions that you are checking in your
> > > > tests begin failing, they will count against this threshold, and
> > > > Jenkins will alert you to a degradation in performance once those
> > > > error thresholds are met.
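> > > >
> > > > For what it's worth, those same settings end up in the job's config.xml
> > > > looking roughly like this (a sketch only: the field names come from the
> > > > Performance plugin, the exact structure varies by plugin version, and
> > > > the 1% / 5% values are arbitrary examples):
> > > >
> > > >     <publishers>
> > > >       <hudson.plugins.performance.PerformancePublisher>
> > > >         <!-- mark the build UNSTABLE at 1% errors, FAILED at 5% errors -->
> > > >         <errorUnstableThreshold>1</errorUnstableThreshold>
> > > >         <errorFailedThreshold>5</errorFailedThreshold>
> > > >         <parsers>
> > > >           <hudson.plugins.performance.JMeterParser>
> > > >             <glob>**/*.jtl</glob>
> > > >           </hudson.plugins.performance.JMeterParser>
> > > >         </parsers>
> > > >       </hudson.plugins.performance.PerformancePublisher>
> > > >     </publishers>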
> > > >
> > > > -c
> > > >
> > > > -----Original Message-----
> > > > From: Shmuel Krakower [mailto:shmulikk@gmail.com]
> > > > Sent: Monday, July 15, 2013 1:54 PM
> > > > To: JMeter Users List
> > > > Subject: Re: Load testing, Continuous Integration, failing on
> > > > build-over-build degradation
> > > >
> > > > Hi Adrian,
> > > > Thanks for sharing, but how exactly do you control the response time
> > > > thresholds or error rates?
> > > > I cannot find any control for this...
> > > >  On Jul 15, 2013 4:26 PM, "Adrian Speteanu" <asp.adieu@gmail.com>
> > > > wrote:
> > > >
> > > > > Hi,
> > > > >
> > > > > Check my attempt at an answer below.
> > > > >
> > > > > Regards,
> > > > > Adrian S
> > > > >
> > > > > On Mon, Jul 15, 2013 at 2:56 PM, Marc Esher <marc.esher@gmail.com>
> > > > > wrote:
> > > > >
> > > > > > Greetings all,
> > > > > >
> > > > > > I'm integrating our load tests into our CI environment, with the
> > > > > > goal of identifying performance degradation as soon as possible.
> > > > > > The idea is to use some kind of threshold, from one CI build to the
> > > > > > next, to identify when performance has dipped to an unacceptable
> > > > > > level from one run to another.
> > > > > >
> > > > > > I'm using Jenkins, currently.
> > > > > >
> > > > > > Anyone have any guidance, strategy, experience, wisdom here?
> > > > > >
> > > > > > The Jenkins Performance Plugin is decent for reporting trends, but
> > > > > > it has no capability to automatically spot problems.
> > > > > >
> > > > >
> > > > > What is your exact expectation regarding this last sentence?
> > > > >
> > > > > I'm currently using the Maven plugin, and it integrates nicely with
> > > > > the Jenkins plugin that you mentioned. The tests fail when expected.
> > > > > Here is the configuration added to the pom.xml (I followed the
> > > > > tutorial from the Jenkins plugin project when first setting up this
> > > > > test project). The thresholds for failures are set in the Jenkins
> > > > > plugin, and they work.
> > > > >
> > > > >                 <groupId>com.lazerycode.jmeter</groupId>
> > > > >                 <artifactId>jmeter-maven-plugin</artifactId>
> > > > > ...
> > > > >                 <executions>
> > > > >                     <execution>
> > > > >                         <id>jmeter-tests</id>
> > > > >                         <phase>verify</phase>
> > > > >                         <goals>
> > > > >                             <goal>jmeter</goal>
> > > > >                         </goals>
> > > > >                     </execution>
> > > > >                 </executions>
> > > > >
> > > > > execution: #mvn clean verify
> > > > >
> > > > >
> > > > > > Thanks!
> > > > > >
> > > > > > Marc
> > > > > >
> > > > >
> > > >
> > > > ---------------------------------------------------------------------
> > > > To unsubscribe, e-mail: user-unsubscribe@jmeter.apache.org
> > > > For additional commands, e-mail: user-help@jmeter.apache.org
> > > >
> > >
> >
>
