httpd-dev mailing list archives

From "William A. Rowe, Jr." <wr...@rowe-clan.net>
Subject Re: Need a new feature: Listing of CGI-enabled directories.
Date Fri, 31 May 2002 04:23:44 GMT
At 09:45 PM 5/30/2002, you wrote:

>I am directing this message to the developer's list because I strongly
>suspect that it may require some new development.
>...
>The first step in finding all such scripts, however, may often be the most
>difficult one.  That first step consists of simply gathering into one
>big list all of the CGI-enabled directories on the local web
>server.  Once such a list has been compiled, other standard UNIX tools
>such as `find' and `file' and `grep' can be set to work, plowing through
>all of the files in those (CGI) directories and finding all of the bad
>FormMail scripts.
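
For the second step you describe, the sweep itself is only a few lines
once the list exists.  A rough sketch in Python, rather than
find/file/grep; the directory names and the FormMail test below are
made up for illustration, not anything shipped with httpd:

   import os, re

   # Hypothetical list of CGI-enabled directories gathered in the first step
   cgi_dirs = ["/var/www/cgi-bin", "/home/user1/public_html/cgi-bin"]
   pattern = re.compile(rb"FormMail", re.IGNORECASE)

   for top in cgi_dirs:
       for dirpath, dirnames, filenames in os.walk(top):
           for name in filenames:
               path = os.path.join(dirpath, name)
               try:
                   with open(path, "rb") as f:
                       if pattern.search(f.read()):
                           print("possible FormMail script:", path)
               except OSError:
                   pass  # skip unreadable files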

I've been thinking along this line for a different reason [not just limited
to the CGI-enabled bit.]  I have no time to implement it at the moment,
but I'll share my thoughts.  Many, many users have trouble
figuring out:

   1. why their scripts don't run
   2. how the +/- options syntax was really parsed
   3. how a given request will be handled

What about an htls command [an 'ls'-style file listing utility for httpd]
that would actually use the core Apache code to run directories and files
through the Location, Directory and Files tests?
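
To make the idea concrete, here is a toy sketch of the walk in Python,
not against the real config tree.  The <Directory> sections and Options
sets are invented, and a real htls would call into the core directory
walk and merge code rather than re-implementing it:

   import os

   # Hypothetical <Directory> sections: path -> Options in effect there
   directory_sections = {
       "/var/www":               {"Indexes", "FollowSymLinks", "MultiViews"},
       "/var/www/dist/project1": {"Indexes", "ExecCGI", "MultiViews"},
   }

   def effective_options(path):
       opts = set()
       # apply sections shortest prefix first, so deeper sections win,
       # the way an absolute Options list replaces the inherited set
       for section in sorted(directory_sections, key=len):
           if path == section or path.startswith(section + os.sep):
               opts = set(directory_sections[section])
       return opts

   # walk the tree and report the flags for every directory and file
   for dirpath, dirnames, filenames in os.walk("/var/www"):
       print(sorted(effective_options(dirpath)), dirpath)
       for name in filenames:
           path = os.path.join(dirpath, name)
           print(sorted(effective_options(path)), path)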

It doesn't have to be blindingly fast [it isn't the daemon itself, it's an
administrator's tool] and it can be rather simple.

One complexity is the difference introduced by <Location> and
<VirtualHost> directives.  My original thought was to list the files, but
those listings are only complete in the context of a host.

If we take the logic in mod_autoindex one step further, and set up
whatever hooks through mod_alias to 'query' the possible matches
under a given pattern, we should be able to make this reasonably
complete.  Perhaps the same sort of hook would work for mod_rewrite,
although the logic to "work those backwards" is probably as complex
as the rewrite code itself.  Recursion is one complication, depending
on how the patterns interact.
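
For plain Alias/ScriptAlias mappings, "querying the possible matches"
is easy to picture.  A sketch, where the alias table is hypothetical
and RewriteRule inversion is deliberately left out for the reason above:

   import os

   # Hypothetical Alias/ScriptAlias table: URL prefix -> filesystem prefix
   aliases = {
       "/icons/":   "/usr/share/apache2/icons/",
       "/cgi-bin/": "/usr/lib/cgi-bin/",
   }

   def urls_under(url_prefix):
       # enumerate every URL the alias can produce by walking its target
       fs_prefix = aliases[url_prefix]
       for dirpath, _dirs, files in os.walk(fs_prefix):
           for name in files:
               fs_path = os.path.join(dirpath, name)
               yield url_prefix + os.path.relpath(fs_path, fs_prefix)

   for url in urls_under("/cgi-bin/"):
       print(url)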

Anyway, I'm not suggesting that htls would ever be a web app!!!
This would be a local tool, similar to mod_info, only providing a list
of resources and the flags that affect them.

One might invoke the utility with:

   htls -r www.apache.org/dist/

and using whatever abbreviations, you might end up with something like:
IN(D)EXES
(I)NCLUDES INC(N)OEXEC
(S)YM_LINKS OPT_SYM_(O)WNER
(E)XECCGI
(M)ULTI 128

D--SO-M  www.apache.org/dist/project1/
D--SO-M  www.apache.org/dist/project1/file.html
D----EM  www.apache.org/dist/project1/search.pl
D--SO-M  www.apache.org/dist/project2/
...
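
Rendering that flag column is the easy part once the effective Options
are known.  A sketch, with the letters taken from the legend above and
spelled out as the corresponding configuration directives:

   # letter/Options-directive pairs, in the column order of the legend above
   FLAGS = [
       ("D", "Indexes"),
       ("I", "Includes"),
       ("N", "IncludesNOEXEC"),
       ("S", "FollowSymLinks"),
       ("O", "SymLinksIfOwnerMatch"),
       ("E", "ExecCGI"),
       ("M", "MultiViews"),
   ]

   def flag_string(options):
       return "".join(ch if name in options else "-" for ch, name in FLAGS)

   example = {"Indexes", "FollowSymLinks", "SymLinksIfOwnerMatch", "MultiViews"}
   print(flag_string(example) + "  www.apache.org/dist/project1/")
   # prints: D--SO-M  www.apache.org/dist/project1/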
You could further extend the concept to include an optional listing of
the handler triggered by such a request, etc.

BTW - you might also want to consider simply reviewing your log files.
It would be really trivial if you could add the handler invoked (or a flag
to indicate that a given filter was invoked) to the logs, so you could
look specifically for requests that trigger mod_cgi.  It might be a more
direct route to identifying the risks you are concerned with.
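
As a sketch of that approach: assuming you have appended the handler
name as a final quoted field on each access-log line (that format, and
the log path, are assumptions rather than a stock layout), pulling out
the mod_cgi hits is a one-pass scan:

   import re

   LOG = "/var/log/httpd/access_log"   # hypothetical path
   # request line is the first quoted field, handler the last quoted field
   line_re = re.compile(r'"(?P<request>[^"]*)".*"(?P<handler>[^"]*)"\s*$')

   with open(LOG) as log:
       for line in log:
           m = line_re.search(line)
           if m and m.group("handler") == "cgi-script":
               print(m.group("request"))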

