jmeter-user mailing list archives

From André van Hoorn <Andre.van.Ho...@Informatik.Uni-Oldenburg.DE>
Subject Re: Performance Test Setup for a web site...
Date Wed, 11 Nov 2009 13:59:21 GMT
Hi,


> On 06/11/2009, Deepak Shetty <shettyd@gmail.com> wrote:
>> Hi
>>  one of the key problems that I face is simulating varying concurrent usage
>>  accurately, e.g. when the day starts out you want this to be low, to hit
>>  peaks as people come into work, and then to drop off at lunch or whatever.
>>  Specifying different thread groups is one way to simulate this,
>>  as you have pointed out.
>>  JMeter doesn't have an easy way to vary thread counts in a thread group;
>>  you must set up the total number a priori and then use ramp-ups or delays.
>>  Are there any plans to allow varying the number of active threads in
>>  the thread group at runtime?
>>  regards
>>  regards
> 
> Probably not in the next release, but perhaps after that.
> 
> You can use the Constant Throughput Timer in conjunction with a
> variable to vary the load; see for example:
> 
> http://jakarta.apache.org/jmeter/usermanual/best-practices.html#beanshell_server
> 
> I've used that in performance tests that ran over many hours.
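For context, the Constant Throughput Timer takes its target in samples per minute, so figures derived from hourly log counts need converting first. A minimal sketch of that arithmetic (the hourly numbers are invented for illustration):

```python
# The Constant Throughput Timer expects its target in samples per MINUTE.
# Sketch: turn hourly hit counts (e.g. from access-log analysis) into
# per-minute values you could feed to the timer via a ${__P(throughput)}
# style property. The figures below are hypothetical.
hourly_hits = {9: 18000, 12: 9000, 15: 21000}

def to_samples_per_minute(hits_per_hour):
    return hits_per_hour / 60.0

for hour, hits in sorted(hourly_hits.items()):
    print(f"{hour:02d}:00 -> target throughput "
          f"{to_samples_per_minute(hits):.1f}/min")
```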

maybe you also want to have a look at our JMeter plug-in Markov4JMeter
(http://markov4jmeter.sourceforge.net/), which allows you to vary the
workload intensity (in terms of active users/threads) based on
mathematical formulae (implemented on top of JMeter's BeanShell
integration). We are using this in our research experiments to simulate
varying workloads.
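To illustrate the formula-driven idea (just a sketch, not Markov4JMeter's actual API, and all constants are hypothetical): the number of active users over a workday can be described by a function of time, low overnight with a broad daytime peak and a dip around lunch, exactly the shape Deepak asked about.

```python
import math

# Hypothetical daily workload-intensity curve: the kind of function a
# formula-driven workload generator could evaluate at each time step.
def active_users(t_hours, base=50, peak=400):
    day = math.exp(-((t_hours - 13.0) / 4.0) ** 2)               # daytime bump
    lunch_dip = 0.3 * math.exp(-((t_hours - 12.5) / 0.75) ** 2)  # lunch dip
    return round(base + (peak - base) * max(day - lunch_dip, 0.0))
```

Evaluated hourly, this stays near the base level overnight and climbs past 300 users in mid-afternoon, with a visible dip around 12:30.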

Best regards,
André

> 
>> deepak
>>
>>
>>  On Fri, Nov 6, 2009 at 6:54 AM, Carl Shaulis <cshaulis@homeaway.com> wrote:
>>
>>  > For getting proportional scripts I can think of two approaches:
>>  > 1.  Scrape a day's worth of log files for all of the requests coming to
>>  > your application servers, then replay these requests using JMeter.
>>  > 2.  Estimate what portion of your community does what, then use various
>>  > thread groups to emulate this traffic (thread group 1 (e.g. visit home
>>  > page) 50%, thread group 2 (e.g. search) 40%, etc.).
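As a sketch of approach 2, the thread-group proportions can be derived directly from the per-URL hit counts in a CSV like the one Harry describes (the sample data here is invented):

```python
import csv
import io
from collections import Counter

# Hypothetical extract of the "links and hit counts" CSV.
SAMPLE = """/home,50000
/search,40000
/product,10000
"""

counts = Counter()
for url, hits in csv.reader(io.StringIO(SAMPLE)):
    counts[url] += int(hits)

total = sum(counts.values())
weights = {url: hits / total for url, hits in counts.items()}
for url, w in sorted(weights.items(), key=lambda kv: -kv[1]):
    # Each weight maps to one thread group's share of the total user count.
    print(f"{url}: {w:.0%} of traffic")
```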
>>  >
>>  > The number of virtual threads really depends on a combination of your
>>  > load servers and your target.  For example, we have a 16-core machine as
>>  > a load generator and an equivalent machine supporting our application.
>>  > Using 100 concurrent threads and no sleeps, the load machine was at less
>>  > than 10% CPU utilization and the target machine was at 90% CPU
>>  > utilization.  We then cached a bunch of the requests, and the target
>>  > machine could respond faster than the load machine, so the load machine's
>>  > CPU was stressed at the same thread count; additionally we were almost at
>>  > our bandwidth limits.  So Deepak is spot on saying monitor machine
>>  > resources.
>>  >
>>  > We have had success using MySQL to inspect data, but we have also kept
>>  > our load tests brief (5 minutes).  If using a Linux box to collect your
>>  > data, you can get a quick estimate of throughput by using wc -l
>>  > <filename> divided by the test duration to give you TPS.
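That wc -l arithmetic amounts to the following (one header line assumed, since JMeter CSV result files usually start with one):

```python
# Same arithmetic as `wc -l results.csv` divided by the run time:
# a quick throughput estimate from a JMeter CSV result file.
def tps(line_count, duration_seconds, header_lines=1):
    """Samples per second: line count minus any header rows, over run time."""
    return (line_count - header_lines) / duration_seconds

# e.g. 90001 lines (1 header + 90000 samples) over a 5-minute run:
print(f"{tps(90001, 5 * 60):.1f} TPS")  # -> 300.0 TPS
```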
>>  >
>>  > Good luck!
>>  >
>>  > Carl
>>  >
>>  >
>>  > On 11/6/09 1:13 AM, "Deepak Shetty" <shettyd@gmail.com> wrote:
>>  >
>>  > >> I will create scripts which will hit the site pages proportional to
>>  > >> their usage
>>  > > This isn't easy.
>>  > >
>>  > >> - What should be the configuration of the machine which will simulate
>>  > >> these many users?
>>  > >> - If more than one test machine is required, please specify their
>>  > >> configuration.
>>  > >> - How many instances of JMeter do we need to run for simulating 5000
>>  > >> users?
>>  > > My preference is multiple low-end machines running separate JMeter
>>  > > instances over one big machine. It simulates the network better. The
>>  > > load you can generate depends on what else is running and what your
>>  > > tests actually do. People have reported running 1000 threads from a
>>  > > single machine. In any case, you must generate a load and check your
>>  > > client machine resources; preferably the CPU shouldn't exceed 60-80%,
>>  > > and the memory used should all be RAM, not virtual. You can increase
>>  > > the number of threads until you hit some limit, after which the client
>>  > > machine may become a bottleneck and give you incorrect results.
>>  > > See related:
>>  > > http://wiki.apache.org/jakarta-jmeter/HowManyThreads
>>  > > http://jakarta.apache.org/jmeter/usermanual/best-practices.html#lean_mean
>>  > > When running multiple JMeter instances you can either run each instance
>>  > > separately (my preference) or run JMeter in master-slave mode (this is
>>  > > less efficient; check the mail archives).
>>  > >
>>  > >> How should the result file output be consolidated from various
>>  > >> scripts? If someone is using any tool for consolidating the output
>>  > >> files, please share information regarding the same.
>>  > >> - Is there any tool for converting these files to some reports? I have
>>  > >> heard that reports can be generated using some available XSLTs in Ant,
>>  > >> but that does not work with large output files. So please suggest
>>  > >> alternatives.
>>  > > I assume you mean combining the results if you run JMeter instances
>>  > > separately. If you use CSV as your format, just concatenate. Mostly, if
>>  > > you have long-running tests, you would load the CSV files into a
>>  > > database table. There are some listeners that can read the CSV/JTL
>>  > > files, but you'd have to use a listener whose memory utilization is
>>  > > constant and not proportional to the number of samples (e.g. Summary
>>  > > Report -
>>  > > http://jakarta.apache.org/jmeter/usermanual/component_reference.html
>>  > > section 18.3).
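A sketch of that concatenation step, keeping only the first header row (the file names and glob pattern are hypothetical, and it assumes every instance wrote the same CSV column configuration):

```python
import glob

def merge_results(pattern="results-*.csv", out="merged.csv"):
    """Concatenate per-instance JMeter CSV result files into one,
    writing the header row only once. Assumes all files share a header."""
    files = sorted(glob.glob(pattern))
    with open(out, "w") as dst:
        for i, path in enumerate(files):
            with open(path) as src:
                header = src.readline()
                if i == 0:
                    dst.write(header)      # keep the first header only
                dst.writelines(src)        # copy the remaining sample rows
    return len(files)
```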
>>  > >
>>  > > regards
>>  > > deepak
>>  > >
>>  > >
>>  > > On Thu, Nov 5, 2009 at 2:54 PM, Harry_ <harjitworks@gmail.com> wrote:
>>  > >
>>  > >>
>>  > >> Hi,
>>  > >>
>>  > >> We need to do performance testing for our website, simulating 5000
>>  > >> users using JMeter and other open source tools. The following
>>  > >> information is with us:
>>  > >>
>>  > >> - A CSV file containing links and the number of times each link was
>>  > >> hit last month. The file is sorted according to popularity of page
>>  > >> visited.
>>  > >> - With this file I can get information about the average number of
>>  > >> hits per unit time, say per minute (an estimate can be made regarding
>>  > >> max load and min load). I will create scripts which will hit the site
>>  > >> pages proportional to their usage.
>>  > >> - All the requests will be HTTP requests.
>>  > >> - The average size of a page will be 350 KB (including embedded
>>  > >> objects within the page).
>>  > >>
>>  > >> Now we need answers to the following questions:
>>  > >>
>>  > >> - How should performance testing of the site simulating that much
>>  > >> user load be done using JMeter?
>>  > >> - What should be the configuration of the machine which will simulate
>>  > >> these many users? (Someone suggested an 8-core @ 3 GHz, 16 GB machine.)
>>  > >> - If more than one test machine is required, please specify their
>>  > >> configuration.
>>  > >> - How many instances of JMeter do we need to run for simulating 5000
>>  > >> users?
>>  > >> - How should the result file output be consolidated from various
>>  > >> scripts? If someone is using any tool for consolidating the output
>>  > >> files, please share information regarding the same.
>>  > >> - Is there any tool for converting these files to some reports? I have
>>  > >> heard that reports can be generated using some available XSLTs in Ant,
>>  > >> but that does not work with large output files. So please suggest
>>  > >> alternatives.
>>  > >> - What other things should I keep in mind for doing the performance
>>  > >> test?
>>  > >>
>>  > >> We would appreciate it if someone could answer these queries based on
>>  > >> their experience.
>>  > >>
>>  > >> Thanks,
>>  > >> Harry
>>  > >> --
>>  > >> View this message in context:
>>  > >> http://old.nabble.com/Performance-Test-Setup-for-a-web-site...-tp26223743p26223743.html
>>  > >> Sent from the JMeter - User mailing list archive at Nabble.com.
>>  > >>
>>  > >>
>>  > >> ---------------------------------------------------------------------
>>  > >> To unsubscribe, e-mail: jmeter-user-unsubscribe@jakarta.apache.org
>>  > >> For additional commands, e-mail: jmeter-user-help@jakarta.apache.org
>>  > >>
>>  > >>
>>  >
>>  >
>>  >
>>  >
>>
> 
> 


-- 
Dipl.-Inform. André van Hoorn

DFG Graduate School on Trustworthy Software Systems (TrustSoft)
Department of Computer Science, University of Oldenburg
PO Box 2503, D-26111 Oldenburg, Germany

Room A2 2-224, Tel: +49 (0)441 798-2866, Fax: -2196
WWW:     http://www.trustsoft.uni-oldenburg.de/members/andrevh
E-Mail:  Andre.van.Hoorn@Informatik.Uni-Oldenburg.DE
PGP key: http://www.trustsoft.uni-oldenburg.de/members/andrevh#pgp


