lucene-java-user mailing list archives

From "Ian Lea" <ian....@gmail.com>
Subject Re: Incremental Indexing.
Date Tue, 09 Sep 2008 08:52:06 GMT
Such incremental indexing is standard practice and unlikely to cause a
problem, particularly if you are only working with a few thousand
documents.  Instead of delete/add you could use
IndexWriter.updateDocument().
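
For example, something along these lines (an untested sketch -- the field
names are taken from your example, the index path and StandardAnalyzer are
just placeholders, and the exact IndexWriter/Field constructors differ a
bit between Lucene versions):

import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.index.Term;
import org.apache.lucene.store.Directory;
import org.apache.lucene.store.FSDirectory;

public class UpdateGood {
    public static void main(String[] args) throws Exception {
        // Open the existing index (the path is just a placeholder).
        Directory dir = FSDirectory.getDirectory("/path/to/index");
        IndexWriter writer = new IndexWriter(dir, new StandardAnalyzer(), false);

        // Build the replacement document for GOOD_ID "1" with the new price and date.
        Document doc = new Document();
        doc.add(new Field("GOOD_ID", "1", Field.Store.YES, Field.Index.UN_TOKENIZED));
        doc.add(new Field("NAME", "ipod", Field.Store.YES, Field.Index.TOKENIZED));
        doc.add(new Field("PRICE", "35000", Field.Store.YES, Field.Index.UN_TOKENIZED));
        doc.add(new Field("CREATEDATE", "2008-11-10:11:00", Field.Store.YES, Field.Index.UN_TOKENIZED));
        doc.add(new Field("UPDATEDATE", "2008-11-10:12:00", Field.Store.YES, Field.Index.UN_TOKENIZED));

        // updateDocument() deletes any existing document(s) matching the term
        // and adds the new document, so there is no separate delete/add step.
        writer.updateDocument(new Term("GOOD_ID", "1"), doc);
        writer.close();
    }
}

The key point is that GOOD_ID is indexed as a single untokenized term, so
new Term("GOOD_ID", "1") matches exactly the document you want to replace.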


--
Ian.


2008/9/9 장용석 <need4spd@gmail.com>:
> Hi~.
> I have a question about Lucene incremental indexing.
>
> I want to do incremental indexing of my goods data.
> For example, I have 4 product records with the
> "GOOD_ID", "NAME", "PRICE", "CREATEDATE", "UPDATEDATE" columns.
>
> 1, ipod, 30000, 2008-11-10:11:00, 2008-11-10:11:00
> 2, java book, 20000, 2008-11-10:11:00, 2008-11-10:11:00
> 3, calendar, 10000, 2008-11-10:11:00, 2008-11-10:11:00
> 4, lucene book, 5000, 2008-11-10:11:00, 2008-11-10:11:00
>
> If I index these records, each document will get a unique docid.
>
> Then I update the record with good_id "1": the PRICE column changes from 30000 to 35000
> and the UPDATEDATE column from 2008-11-10:11:00 to 2008-11-10:12:00.
>
> In this case, I want to update my index with the new data for good_id "1".
>
> According to the book, if I want to update my index I should delete the target
> data from the index and then add the new data.
> If there is only one target document, I think that is no problem for me or my applications.
> But if there are 3000 (or more) target documents, the application must do the
> delete-and-add job 3000 (or more) times.
> I am worried that this will be a problem for my applications.
> Or is this not a problem at all?
>
> I need your help. :-)
>
> many thanks.
> Jang.
>
> --
> DEV용식
> http://devyongsik.tistory.com
>