tomcat-users mailing list archives

From André Warnier
Subject Re: Tomcat access log reveals hack attempt: "HEAD /manager/html HTTP/1.0" 404
Date Mon, 15 Apr 2013 17:15:11 GMT
Neven Cvetkovic wrote:
> How about creating a fake manager application :)))
> That takes X minutes/seconds to get back a 404 ;)))

Just for the sake of the discussion :
- a fake manager application would apply only to the /manager webapp, not to other 
potential hacking targets, no ? (or you would have to "map" it to every potential hacking 
URL, which may be inconvenient).  Also, you'd have to duplicate this webapp in every 
configured <Host>.
- the fact that it is a genuine webapp means that during the delay before the 404 
response, at least one Tomcat thread remains blocked executing that application, for each 
such request.  I was thinking more in the direction of off-loading such 404 responses to 
some specialised lightweight thread, using as few resources as possible.  It wouldn't 
really matter if a queue of such responses built up, as they merely delay the eventual 
response to the (miscreant) client(s).
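To illustrate what I mean, here is a minimal, purely hypothetical sketch (none of this is 
Tomcat API; the class and constant names are made up) of how a single scheduler thread 
could hold a whole queue of pending delayed 404s, so that the worker thread returns 
immediately instead of sleeping:

```java
import java.util.concurrent.*;
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical sketch: one lightweight scheduler thread serves every
// pending delayed 404, instead of one blocked worker thread per probe.
public class DelayedNotFound {
    static final long RESPONSE_DELAY_MS = 100;

    // Single thread holding the entire queue of delayed responses.
    static final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();

    // A worker thread calls this and returns immediately; the actual
    // 404 write (represented here by a Runnable) happens later on the
    // scheduler thread.
    static ScheduledFuture<?> delayed404(Runnable send404) {
        return scheduler.schedule(send404, RESPONSE_DELAY_MS, TimeUnit.MILLISECONDS);
    }

    // Simulate n probe requests; returns how many 404s were eventually sent.
    static int simulateProbes(int n) throws Exception {
        AtomicInteger sent = new AtomicInteger();
        ScheduledFuture<?>[] pending = new ScheduledFuture<?>[n];
        for (int i = 0; i < n; i++)
            pending[i] = delayed404(sent::incrementAndGet);  // worker not blocked
        for (ScheduledFuture<?> f : pending)
            f.get();                                          // wait out the delays
        return sent.get();
    }

    public static void main(String[] args) throws Exception {
        System.out.println("delayed 404s sent: " + simulateProbes(100));
        scheduler.shutdown();
    }
}
```

The point of the sketch is only that the queue of pending 404s costs almost nothing : 
a hundred probes occupy one scheduler thread for the duration of the delay, not a 
hundred worker threads.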

More ideas ?

P.S. I'd love to see this become a standard Tomcat feature, because it would mean that 
within a certain time period, thousands and thousands of Tomcat servers on the Internet 
would become annoying for these hacking programs.  If it were a webapp that everyone had 
to deploy optionally on individual Tomcat servers, it would be much less effective, I think.

Of course at the moment I am just fishing here for potential negative side-effects.

Provided the idea makes sense, however, I would also post it on the Apache httpd list.  
If it were somehow adopted there as well, that could have quite a global impact.

One potential negative side-effect that I can see is on one of my own programs (or 
similar ones) : for some customers, I created a "URL checker" program, which goes through 
their databases looking for third-party links, and gives them a list of the ones that are 
not working (so that they can correct their data).  Of course, if all webservers on the 
web implemented my idea, then this genuine utility program would take much longer to run, 
because it would experience an extra delay for each incorrect URL (in cases where the 
host is correct, but the URL on that server is not).
I'm also quite sure that Google won't really like the idea...
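A checker like mine could mitigate the extra delay with an aggressive per-request 
timeout.  A small self-contained sketch (the class and method names are my own invention, 
and the built-in com.sun.net.httpserver server here just stands in for a remote host that 
tar-pits its 404s) :

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.InetSocketAddress;
import java.net.URL;

public class UrlCheckerTimeout {
    // Hypothetical checker: HEAD the URL; anything >= 400, unreachable,
    // or slower than timeoutMs is reported as broken.
    static String check(String url, int timeoutMs) {
        try {
            HttpURLConnection c = (HttpURLConnection) new URL(url).openConnection();
            c.setRequestMethod("HEAD");
            c.setConnectTimeout(timeoutMs);
            c.setReadTimeout(timeoutMs);   // bounds any deliberate 404 delay
            int code = c.getResponseCode();
            c.disconnect();
            return code < 400 ? "ok" : "broken (" + code + ")";
        } catch (IOException e) {
            return "broken (timeout or unreachable)";
        }
    }

    // Local stand-in for a remote server that delays its 404 responses.
    static String demo() throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/missing", ex -> {
            try { Thread.sleep(2000); } catch (InterruptedException ignored) {}
            ex.sendResponseHeaders(404, -1);   // 404 only after a 2 s tar-pit
            ex.close();
        });
        // Daemon handler threads, so the sleeping handler can't keep the JVM alive.
        server.setExecutor(r -> { Thread t = new Thread(r); t.setDaemon(true); t.start(); });
        server.start();
        String result = check("http://127.0.0.1:" + server.getAddress().getPort()
                + "/missing", 500);            // give up after 500 ms, not 2 s
        server.stop(0);
        return result;
    }

    public static void main(String[] args) throws IOException {
        System.out.println(demo());
    }
}
```

So the checker still gets its answer, just a classified "broken" rather than the exact 
status code when the remote server stalls.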

