httpd-users mailing list archives

From Chase <xan...@juun.com>
Subject [users@httpd] lots of near-simultaneous requests to the same file
Date Thu, 28 Jul 2005 22:43:23 GMT
i have a tiny little cgi script (one line of code) that will be  
called at least once a minute by at least 1000 users of a piece of  
software that i'm developing.

the user can, if they want, set the refresh time to up to an hour,  
but let's consider the worst-case scenario for the purpose of this  
question.

is there any benefit to creating multiple, identical copies of the  
same cgi script and having the software randomly select which one to  
fetch?

something like:

     /cgi-bin/myscript01.cgi
     /cgi-bin/myscript02.cgi
     /cgi-bin/myscript03.cgi
     /cgi-bin/myscript04.cgi
     /cgi-bin/myscript05.cgi
     /cgi-bin/myscript06.cgi
     /cgi-bin/myscript07.cgi
     /cgi-bin/myscript08.cgi
     /cgi-bin/myscript09.cgi
     /cgi-bin/myscript10.cgi
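
to be concrete, the client-side selection i have in mind would be  
something like this (the host name is just a placeholder, and the  
script names mirror the list above):

```python
import random

# Hypothetical base URL for illustration only.
BASE = "http://example.com/cgi-bin"

# The ten identical copies listed above: myscript01.cgi .. myscript10.cgi
SCRIPTS = ["myscript%02d.cgi" % n for n in range(1, 11)]

def pick_url():
    """Randomly choose one of the identical CGI endpoints to fetch."""
    return "%s/%s" % (BASE, random.choice(SCRIPTS))
```

each client would call pick_url() before every fetch, so the requests  
would spread roughly evenly across the ten copies.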


i see this sort of thing from time to time on the internet, but i've  
never heard anyone explain why someone might do it.  i always just  
assumed the implication is that a single file maybe has some  
limitation on the number of times it can be simultaneously opened.

i don't mind setting it up like this, but it would of course be  
simpler to leave it as a single script.

can someone elaborate on this subject please?

thanks.

- chase



---------------------------------------------------------------------
The official User-To-User support forum of the Apache HTTP Server Project.
See <URL:http://httpd.apache.org/userslist.html> for more info.
To unsubscribe, e-mail: users-unsubscribe@httpd.apache.org
   "   from the digest: users-digest-unsubscribe@httpd.apache.org
For additional commands, e-mail: users-help@httpd.apache.org

