perl-modperl mailing list archives

From: André Warnier <...@ice-sa.com>
Subject: Checking a webserver
Date: Wed, 28 Jan 2009 10:21:31 GMT
Hi.

I am looking for a debugging tool that can repeatedly issue HTTP
requests to one or more URLs and log any errors. Preferably in Perl,
and it must run on a Windows workstation.

Something like wget or curl, but with command-line options that let it
run in a continuous loop at an adjustable request-issuing rate. Since I
know exactly what kind of pages I should receive in return, I might
want to modify the tool (if it is in Perl) to add some HTML parsing
that looks for specific information in the response pages.

I could of course write something myself using LWP, but I figure it is
most likely that someone has already done this.
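
Roughly, what I have in mind would look something like the sketch
below (just a minimal LWP::UserAgent loop; the URLs, the interval and
the content check are placeholders):

  #!/usr/bin/perl
  # Poll a list of URLs in a loop at an adjustable rate and log any
  # HTTP errors or unexpected page content.
  use strict;
  use warnings;
  use LWP::UserAgent;

  my @urls     = ('http://localhost/test1', 'http://localhost/test2');
  my $interval = 5;    # seconds between rounds (request-issuing rate)

  my $ua = LWP::UserAgent->new( timeout => 10 );

  while (1) {
      for my $url (@urls) {
          my $resp = $ua->get($url);
          if ( !$resp->is_success ) {
              warn localtime() . " ERROR $url : " . $resp->status_line . "\n";
          }
          elsif ( $resp->decoded_content !~ /expected marker/ ) {
              warn localtime() . " UNEXPECTED CONTENT $url\n";
          }
      }
      sleep $interval;
  }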
My main problem is that I don't know what such a tool would be called
or which keywords to search for.

I would appreciate any pointers, even just the keywords to search.
Thanks in advance.

