lucene-java-user mailing list archives

From John Moylan <jo...@rte.ie>
Subject Re: Downloading Full Copies of Web Pages
Date Wed, 20 Oct 2004 12:54:30 GMT
"wget" does this. Little point in reinventing the wheel.

Luciano Barbosa wrote:
> Hi folks,
> I want to download full copies of web pages and store them locally, along 
> with their hyperlink structures, as local directories. I tried to use 
> Lucene, but I've realized that it doesn't have a crawler.
> Does anyone know of software that does this?
> Thanks,
> 
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: lucene-user-unsubscribe@jakarta.apache.org
> For additional commands, e-mail: lucene-user-help@jakarta.apache.org
> 



