httpd-dev mailing list archives

From Rob Hartill <hart...@ooo.lanl.gov>
Subject Re: indexing suggestion
Date Tue, 11 Apr 1995 17:41:49 GMT
 
>    From: Rob Hartill <hartill@ooo.lanl.gov>
>    Date: Mon, 10 Apr 95 10:37:48 MDT
> 
>    Have httpd parse ALIWEB index files, and return formatted output.
> 
> ... um... why not just parse the thing once, when it's built, and
> serve the output as an ordinary file?

I'm suggesting we have the server search the index file when it
is requested with arguments. Without arguments it prompts for some.

So I could hit some site with a link to their index, and be able to
type in a keyword. It would then give me a formatted list of pointers
to what I probably wanted.
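
To make that concrete, here's a rough sketch of the two modes in C
(what the server is written in), assuming the server hands whatever
follows the '?' in the URL to the handler as a plain string; the names
here are just for illustration:

#include <stdio.h>

void handle_index_request(FILE *out, const char *query)
{
    if (query == NULL || *query == '\0') {
        /* No arguments: prompt the client for a keyword,
         * ISINDEX-style. */
        fprintf(out, "Content-type: text/html\r\n\r\n");
        fprintf(out, "<TITLE>Search this index</TITLE>\n");
        fprintf(out, "<ISINDEX>\n");
        fprintf(out, "Type a keyword to search this index.\n");
    } else {
        /* Arguments given: search the index file and return a
         * formatted list of pointers (a search sketch follows the
         * format example below). */
    }
}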

Now it may be that ALIWEB doesn't have the ideal syntax for this, but
maybe we can define some kind of local index file that better suits this
idea.

The index files, being of a special MIME type, could be placed in
lots of directories, so that each index is specific to that
region of the server. A top-level index could point directly to
resources or to lower-level indices.

A simple approach could be to have a format such as

#comment
URL
keywords
description


e.g.
#let's index my new game
/Robs_junk/new/game.html
game,entertainment,hangman,fun
A www version of the classic hangman game
# I stole the hangman code from Fred's site
http://fred.com/cgi-bin/hangman
game,entertainment,hangman,fun,fred
The original version which I based <A HREF="/Robs_junk/new/game.html">my hangman game
on</A>



HTML doesn't give a damn about \r\n, so the syntax could just be one
field per line, with unlimited line length.
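
For what it's worth, a minimal sketch of the search side, assuming the
three-lines-per-entry layout above and a crude substring match on the
keyword line ("search_index" and the 8k line cap are placeholders, not
a real interface):

#include <stdio.h>
#include <string.h>

#define LINELEN 8192   /* "unlimited" in the proposal; bounded here */

void search_index(FILE *out, const char *path, const char *keyword)
{
    FILE *idx = fopen(path, "r");
    char url[LINELEN], keys[LINELEN], desc[LINELEN];

    if (idx == NULL)
        return;

    fprintf(out, "<TITLE>Matches for %s</TITLE>\n<UL>\n", keyword);

    while (fgets(url, sizeof(url), idx) != NULL) {
        if (url[0] == '#' || url[0] == '\n')
            continue;                       /* comments and blank lines */
        if (fgets(keys, sizeof(keys), idx) == NULL ||
            fgets(desc, sizeof(desc), idx) == NULL)
            break;                          /* truncated entry */
        url[strcspn(url, "\r\n")] = '\0';   /* trim the line ending */

        if (strstr(keys, keyword) != NULL)  /* crude keyword match */
            fprintf(out, "<LI><A HREF=\"%s\">%s</A>: %s\n",
                    url, url, desc);
    }

    fprintf(out, "</UL>\n");
    fclose(idx);
}

The match is deliberately dumb (a case-sensitive substring test); a
real version would split the keyword line on commas and compare the
terms properly, but that's a detail.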


Robots could be encouraged to request the source of the index, so that
they do a proper job of indexing the web.

Thoughts?


