httpd-users mailing list archives

From Douglas K. Fischer <fischer.d...@grantstreet.com>
Subject Re: [users@httpd] Service Attacks hanging the Web Server
Date Tue, 01 Oct 2002 17:53:09 GMT
This is the Slapper OpenSSL worm. Note the "GET / HTTP/1.1" request: 
it is the worm's initial probe to determine your Apache version. 
On systems that are not compromised by the worm's exploit (e.g. because 
the wrong memory offset is used), the child processes can still crash 
or hang due to the memory corruption caused by the exploit attempt.
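
If it helps to pick these out quickly, something like the rough sketch 
below will pull both the host-less probes and the 408 timeouts Denise 
mentions below. This is only a sketch: it assumes the standard 
common/combined access log format, and the log path is just a 
placeholder for wherever your access_log lives.

# Sketch only -- log path and format are assumptions, adjust as needed.
import re

LOG_PATH = "/var/log/httpd/access_log"          # placeholder path
probe = re.compile(r'"GET / HTTP/1\.1" 400 ')   # host-less probe -> 400
hang = re.compile(r'" 408 ')                    # request timed out / hung

with open(LOG_PATH) as log:
    for line in log:
        if probe.search(line) or hang.search(line):
            print(line.rstrip())

In the access log excerpt below, 65.69.158.242 shows exactly that 
pattern: a 400 on the probe at 01:21:55, then a 408 a few minutes later.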

The robots.txt requests are unrelated.

The IPs are not being spoofed, to the best of my knowledge. They are 
compromised hosts on which the worm is running.

Upgrading your Apache/mod_ssl/OpenSSL to a non-vulnerable version 
should prevent the child process crashes/hangs. The worm will still 
probe you and attempt the exploit, but OpenSSL will catch and log the 
attempt instead of crashing or hanging.
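
If you want to double-check what you are running before and after the 
upgrade, a quick check along these lines works. Again just a sketch: it 
assumes the binaries are called "httpd" and "openssl" and are on the 
PATH; on some installs they are apachectl, apache2, etc., so adjust the 
names.

# Sketch: print Apache and OpenSSL versions, assuming standard binary names.
import subprocess

for cmd in (["httpd", "-v"], ["openssl", "version"]):
    try:
        result = subprocess.run(cmd, capture_output=True, text=True, check=True)
        print(result.stdout.strip())
    except (OSError, subprocess.CalledProcessError) as err:
        print("could not run", " ".join(cmd), "-", err)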

Check out cert.org, isc.incidents.org, etc. for more information.

Doug
__________________________________
Douglas K. Fischer
Linux Systems Administrator & Programmer
Grant Street Group, Inc.

On Tuesday, October 1, 2002, at 12:52 PM, Denise Pederson wrote:

> I guess my example was a little misleading. First of all, the IP 
> addresses are being spoofed; each time they come from a different 
> legitimate address. Second, they do not always try to access the 
> robots.txt file first. Generally we just have to look for the - 408 - 
> message to find out what time everything hung up.
>
> 65.69.158.242 - - [01/Oct/2002:01:21:55 -0600] "GET / HTTP/1.1" 400 378
> 65.69.158.242 - - [01/Oct/2002:01:27:58 -0600] "-" 408 -
>
> [Tue Oct  1 01:22:53 2002] [error] server reached MaxClients setting, consider raising the MaxClients setting
> [Tue Oct  1 01:27:59 2002] [notice] child pid 20c1e296 exit signal Bad system call (12, 0x1000000C)
> [Tue Oct  1 03:01:07 2002] [error] [client 216.39.48.116] File does not exist: /apache$common/htdocs/robots.txt
> [Tue Oct  1 03:50:05 2002] [error] [client 209.73.164.50] File does not exist: /apache$common/htdocs/robots.txt
>
> Jacob Coby wrote:
>
> Error_log.
> [Mon Sep 30 17:01:28 2002] [error] [client 66.196.73.80] File does not exist: /apache$common/htdocs/robots.txt
> [Mon Sep 30 17:05:09 2002] [error] [client 65.205.158.10] client sent HTTP/1.1 request without hostname (see RFC2616 section 14.23): /
> [Mon Sep 30 17:06:05 2002] [error] server reached MaxClients setting, consider raising the MaxClients setting
> [Tue Oct  1 00:03:10 2002] [notice] caught SIGTERM, shutting down
>
> Any ideas of how I might stop the Service attacks from disabling the
> Servers?
>
