httpd-users mailing list archives

From Jonathan Zuckerman <>
Subject Re: [users@httpd] Serving partial data of in-memory common data set
Date Wed, 29 Jul 2009 18:40:08 GMT
On Wed, Jul 29, 2009 at 11:18 AM, S.A.<> wrote:
>> ...
>> > We started with 30 concurrent users and there was no trouble,
>> > but when the next batch of 70 users hit concurrently, we could
>> > not serve all the users.
>> >
>> Just a note for now.  The above is an important part,
>> and maybe you should have started your initial post with
>> that information.
>> It shows that you have a running system, that you have
>> tried your application, and that you have already seen a
>> problem.
>> I know it is a matter of presentation, but it puts
>> everything in a different light and is more likely to
>> attract attention and helpful responses, than the way you
>> presented your initial post, which looked more like you were
>> looking for theoretical answers to hypothetical questions.
> Maybe I should have. I was debating between a long post
> and a summary and, in the interest of time, tried to post
> just what I was looking for.
>> >
>> > We have about 10 sets of 50 pages, each page containing
>> > a varying number of images (on average 45 images of
>> > about 2KB and about 50 or so 0.5KB images) and an
>> > occasional multi-media file. For this discussion, we can
>> > ignore the multi-media file. The page content and the
>> > sequence of page presentation is user dependent, but for
>> > a group of users this 50-page set is constant. With each
>> > request we do updates on 3 mysql tables.
>> >
>> The last sentence is also something worth investigating.
>> Such as: is the bottleneck at the level of the images, or
>> at the level of the mysql accesses?
> The reason I am not suspecting mysql is that the mysql
> log indicates it is getting all the requests and
> servicing them. As I have stated before, though, some
> users are not getting images.
>> ...
>> > Our eventual objective is to serve about 200,000 users
>> > (of course not with our existing hardware) and we are
>> > looking at various options.
>> That is also a good number to give, as it gives a measure
>> of the goal.
>> ...
>> >
>> > We definitely have a problem and are looking at
>> > various options to resolve it.
>> My first reaction to the above would be that yes, you do
>> have a problem.  If you cannot serve the kind of
>> content you mention for 70 users, with the hardware you
>> mention, then something definitely seems amiss.
>> I cannot imagine that you could have a problem serving 70
>> clients simultaneously, with pages containing 45 images of 2
>> KB each.
>> My gut feeling tells me that you may have a concurrency
>> problem at the level of your mysql accesses.
>> Would it be possible, for instance, to replace the mysql
>> accesses by some static content fetched from disk (for
>> example, 1 distinct text file for each of your client-ids,
>> the name being client-id driven) ?
>> And then see if you have a problem ?
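The isolation experiment suggested above can be sketched with nothing but stdlib Python (a toy stand-in for httpd and the real images, purely an assumption for illustration, not the poster's setup): serve one static ~2KB file and fetch it with 70 concurrent clients, the batch size that hit the wall in this thread.

```python
# Minimal sketch of the suggested isolation test: purely static content,
# no mysql in the path, 70 simultaneous clients.
import os
import tempfile
import threading
import urllib.request
from concurrent.futures import ThreadPoolExecutor
from functools import partial
from http.server import ThreadingHTTPServer, SimpleHTTPRequestHandler

# A 2 KB dummy "image", roughly the average size mentioned in the thread.
docroot = tempfile.mkdtemp()
with open(os.path.join(docroot, "img.bin"), "wb") as f:
    f.write(b"\0" * 2048)

handler = partial(SimpleHTTPRequestHandler, directory=docroot)
server = ThreadingHTTPServer(("127.0.0.1", 0), handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = "http://127.0.0.1:%d/img.bin" % server.server_address[1]

def fetch(_):
    # One client request; returns the number of bytes received.
    with urllib.request.urlopen(url) as resp:
        return len(resp.read())

# 70 concurrent clients against purely static content.
with ThreadPoolExecutor(max_workers=70) as pool:
    sizes = list(pool.map(fetch, range(70)))
server.shutdown()

ok = sum(1 for s in sizes if s == 2048)
print("%d of %d requests served correctly" % (ok, len(sizes)))
```

If the static-only version serves all 70 cleanly, the suspicion shifts to the mysql path; if it does not, the problem is in the serving layer itself.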
> There is very little data that gets served to the user
> from mysql. We use mysql to maintain statistics on what
> content the user is accessing and the sequence in which
> the info is presented, as the order matters to us. So it is
> predominantly updates that we do to mysql. Of course,
> updates are more latency-intensive than reads, yet those
> operations seem to get through.
>> The reason for that gut feeling is that you wrote somewhere
>> that you can serve up to a certain number of clients fine,
>> but when increasing the number, you hit a wall.
> We have looked at the mysql aspect and will continue to
> look at it. Hardware is another issue we need to resolve,
> and then maybe memcached.
> Thank you again for your responses.
> ---------------------------------------------------------------------
> The official User-To-User support forum of the Apache HTTP Server Project.
> See <URL:> for more info.
> To unsubscribe, e-mail:
>   "   from the digest:
> For additional commands, e-mail:

This might not be an option if image access varies based on privilege
of user (or if every user has unique images), but have you considered
using a spritemap for the images?  50 http requests can be reduced to
a single request for one combined image.
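For what it's worth, the bookkeeping behind a sprite map is simple. This sketch (tile sizes and layout are hypothetical, not from the thread) computes the CSS background-position offset that selects one tile out of a combined sheet:

```python
# Hypothetical sketch: N uniformly sized tiles packed row-major into one
# sprite sheet with `cols` columns. The browser shows tile i of the big
# image by shifting the background left/up by the tile's pixel offset.
def sprite_offset(i, tile_w, tile_h, cols):
    col, row = i % cols, i // cols
    return (-col * tile_w, -row * tile_h)

# Tile 12 in a 10-column sheet of 20x20 tiles sits at column 2, row 1,
# so the CSS would be: background-position: -40px -20px;
print(sprite_offset(12, 20, 20, 10))
```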

