httpd-users mailing list archives

From "Jacob Coby" <>
Subject Re: [users@httpd] Service Attackes hanging the Web Server
Date Tue, 01 Oct 2002 16:32:01 GMT
> Error_log:
> [Mon Sep 30 17:01:28 2002] [error] [client] File does not exist: /apache$common/htdocs/robots.txt
> [Mon Sep 30 17:05:09 2002] [error] [client] client sent HTTP/1.1 request without hostname (see RFC2616 section 14.23): /
> [Mon Sep 30 17:06:05 2002] [error] server reached MaxClients setting, consider raising the MaxClients setting
> [Tue Oct  1 00:03:10 2002] [notice] caught SIGTERM, shutting down
>
> Any ideas of how I might stop the Service attacks from disabling the Servers?

Your access log and error log entries don't match.  Ignoring that, how do
you know you are getting attacked?  robots.txt is usually requested by web
spiders, including search engines, to find out which parts of your website
they shouldn't crawl.

Write a robots.txt, even if it's blank or funny.  At least you won't be
logging a 404 error anymore.
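A permissive robots.txt can be as small as the sketch below (this one allows
everything; swap the empty Disallow for "Disallow: /" if you'd rather turn
all well-behaved spiders away):

```
# robots.txt -- drop this in your document root
User-agent: *
Disallow:
```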

You could try raising MaxClients until you hit a memory limit (you don't want
apache to start swapping).  You can also try lowering the KeepAliveTimeout,
MaxKeepAliveRequests, and Timeout values to keep one client from hogging
all of your httpd processes.
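Something like the following httpd.conf fragment -- the numbers here are
only illustrative, not recommendations; tune them against your own memory
and traffic:

```
# httpd.conf -- illustrative values only
MaxClients           150   # raise cautiously; watch memory so apache never swaps
KeepAlive            On
KeepAliveTimeout     5     # seconds an idle keep-alive connection is held open
MaxKeepAliveRequests 50    # requests allowed per keep-alive connection
Timeout              60    # seconds to wait on a slow request before giving up
```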

Contact whoever owns the IP addresses of the attackers (a whois lookup on the
address will usually name the netblock owner and an abuse contact).

Write or find a smart log analyzer that checks for malicious activity and
logs it.  If the same IP address is found guilty of attacking more often
than is acceptable, disable all TCP traffic from it for a period.  They'll
think they've DoS'ed you, and you get to continue on with the status quo.
Optionally write a tool to email the owner of their netblock.
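A minimal sketch of the counting half of such an analyzer, assuming a
common-log-format access log where each line starts with the client IP;
the MAX_HITS threshold is made up for illustration, and blocking/emailing
the offender is left to you:

```python
import re
from collections import Counter

# Hypothetical threshold: flag any IP seen more than this many times.
MAX_HITS = 3

# Matches the leading IPv4 address of a common-log-format line.
IP_RE = re.compile(r"^(\d{1,3}(?:\.\d{1,3}){3}) ")

def suspicious_ips(log_lines, max_hits=MAX_HITS):
    """Return, sorted, the client IPs that appear more than max_hits times."""
    counts = Counter()
    for line in log_lines:
        m = IP_RE.match(line)
        if m:
            counts[m.group(1)] += 1
    return sorted(ip for ip, n in counts.items() if n > max_hits)

if __name__ == "__main__":
    sample = ['10.0.0.1 - - "GET / HTTP/1.0" 200'] * 5 + \
             ['10.0.0.2 - - "GET /robots.txt HTTP/1.0" 404']
    print(suspicious_ips(sample))  # only 10.0.0.1 exceeds the threshold
```

Feed it the access log, then hand the flagged addresses to whatever firewall
rule or notification tool you prefer.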

Attack them back.  Imagine their surprise :)  Too bad it's illegal.


The official User-To-User support forum of the Apache HTTP Server Project.
See <URL:> for more info.
