tomcat-users mailing list archives

From André Warnier
Subject Re: Tomcat access log reveals hack attempt: "HEAD /manager/html HTTP/1.0" 404
Date Tue, 16 Apr 2013 16:38:43 GMT
Mark H. Wood wrote:
> On Mon, Apr 15, 2013 at 07:15:11PM +0200, André Warnier wrote:
>> Neven Cvetkovic wrote:
>>> How about creating a fake manager application :)))
>>> That takes X minutes/seconds to get back a 404 ;)))
> [snip]
>> Of course at the moment I am just fishing here for potential negative side-effects.
> Search for "tarpit".  There should be a lot of discussion.

Thanks, that was a good tip.  And me thinking I had an original idea here...

However, I will keep insisting on my main point: all the tools which I have read about
so far are great, and much more sophisticated than my crude 404-delay proposal.  But
they all require additional components, and quite a bit of dedication and expertise, in
order to take advantage of them.  And, when installed somewhere, they only protect the
systems on which they are installed.
Which, given the relative complexity of doing so, is likely to remain a small proportion
of the webservers on the WWW.

Which still leaves it "economical" for these scanning bots to keep scanning, because
they will still find enough vulnerable servers within a reasonable amount of time spent
scanning.
After all, these tools have existed for a while already, and my servers are still being
scanned every day, so it must still be worthwhile to do so, right?

My point is that if this were really simple to install (such as installed by default,
and optionally disabled), then over time a large proportion of webservers would
implement the scheme, and this would make it uneconomical for these bot scanners to
keep scanning, because it would take a prohibitive amount of time and/or resources,
compared to the number of vulnerable servers to be discovered.

And then, because it would be uneconomical, one might hope that this line of attack
would just disappear.  And then *all* servers would benefit, even the ones which have
never been "vaccinated" in the first place.

In order to determine the potential benefits or pitfalls of something like this, it is
often useful to make some back-of-the-envelope calculations, to get a rough idea.
I have done so, and although there is probably much to criticise in my methodology, the
results tend to suggest something like this:
- given a bot that scans 100,000 IP addresses to try to find servers running webservers
which have one of these vulnerabilities, with a guessed 2,500 such vulnerable servers
among them:
   - without a delay on the 404 responses, it would take this bot about 2 1/2 hours to
find the 2,500 vulnerable servers among the original 100,000 IP addresses
   - with a 1 s delay on 404 responses, it would take that same bot 93 hours for the same.
That is almost 40 times longer.

Or, another way of looking at this: for every 40 servers scanned without a 404 delay,
the same bot infrastructure within the same time would only be able to scan 1 server
if a 1 s 404 delay was implemented by 50% of the webservers.
In other words, 39 out of every 40 servers would now be left alone, whether they
implement the 404 delay or not.
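For what it's worth, the shape of that calculation can be sketched in a few lines of
Python.  The probes-per-host and base round-trip figures below are my own assumptions
(the message above does not state them), so this reproduces the structure of the
argument rather than the exact 2 1/2 h / 93 h figures:

```python
# Back-of-the-envelope scan-time model.  probes_per_host and base_rtt
# are assumed values, not taken from the message above.

def scan_time(hosts, vulnerable, probes_per_host, base_rtt, delay_404):
    """Total seconds for a bot to sweep `hosts` IP addresses.

    Every probe costs one base round trip; every probe that draws a
    404 (all probes except one "hit" per vulnerable host) costs an
    extra delay_404 seconds when the tarpit delay is enabled.
    """
    total_probes = hosts * probes_per_host
    delayed_404s = total_probes - vulnerable  # one non-404 hit per vulnerable host
    return total_probes * base_rtt + delayed_404s * delay_404

# Assumed figures: 3 probe URLs per host, 30 ms base round trip.
baseline = scan_time(100_000, 2_500, 3, 0.03, 0.0)  # no 404 delay
tarpit   = scan_time(100_000, 2_500, 3, 0.03, 1.0)  # 1 s delay on 404s

print(f"no delay:  {baseline / 3600:.1f} h")
print(f"1 s delay: {tarpit / 3600:.1f} h")
print(f"slowdown:  {tarpit / baseline:.0f}x")
```

With these assumed parameters the delayed 404s completely dominate the total, and the
slowdown lands in the tens, the same order of magnitude as the rough 40x above.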

I am willing to submit my calculations for criticism or demolition.  But it seems to me
that the kind of numbers above should at least warrant some interest from the
powers-that-be, for a parameter that might be as simple as
<Connector delay404="1000" ... />
(of course, I am not talking about the actual implementation)
