nutch-user mailing list archives

From "Alvaro Cabrerizo" <topor...@gmail.com>
Subject Re: Re-injecting URLS, perhaps by removing them from the CrawlDB first?
Date Thu, 02 Nov 2006 09:02:53 GMT
Hi:

At least in Nutch 0.8.x you can filter the crawldb, linkdb and segments
using URL patterns (e.g., by editing regex-urlfilter.txt). Just execute this
command:

"nutch_home/bin/nutch mergedb"

It will print the usage information:

CrawlDbMerger output_crawldb crawldb1 [crawldb2 crawldb3 ...] [-filter]
        output_crawldb  output CrawlDb
        crawldb1 ...    input CrawlDb-s
        -filter use URLFilters on urls in the crawldb(s)

So, in order to use it, run a command like:

"nutch_home/bin/nutch mergedb output_crawldb your_crawldb -filter"

Before running it, you have to edit regex-urlfilter.txt (assuming you are
using the urlfilter-regex plugin), adding patterns for the URLs you want to
filter out.
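
For example, a minimal regex-urlfilter.txt could look like the following
(the host and path are just placeholders; with urlfilter-regex the first
matching rule wins, '-' rejects a URL and '+' accepts it):

    # reject everything under this example section
    -^http://www\.example\.com/old-section/

    # accept any other URL
    +.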

In order to check that it works, dump both your original crawldb and the
filtered one using the "readdb" nutch command, and make a diff.

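By the way, the linkdb can be filtered the same way: if I remember
correctly, 0.8.x also ships a mergelinkdb command (LinkDbMerger) that
accepts -filter, e.g.:

    nutch_home/bin/nutch mergelinkdb output_linkdb your_linkdb -filter
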
Hope it helps.
