lucene-java-user mailing list archives

From "John Griffin" <jgrif...@thebluezone.net>
Subject Re-tokenized fields disappear
Date Tue, 07 Oct 2008 02:39:28 GMT
Guys,

 

I have documents with multiple stored, tokenized fields of the same name but
different values, such as the pairs below (a short build sketch follows them):

 

"codesearch", "B01"

"codesearch", "B0105"

"codesearch", "Q01"

 

etc.
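
For concreteness, the field gets built up roughly like this (just a sketch
against the Lucene 2.x Field API; the values are the ones above):

import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;

static Document buildDoc() {
    Document doc = new Document();
    // Several values stored and tokenized under the same field name.
    doc.add(new Field("codesearch", "B01",   Field.Store.YES, Field.Index.TOKENIZED));
    doc.add(new Field("codesearch", "B0105", Field.Store.YES, Field.Index.TOKENIZED));
    doc.add(new Field("codesearch", "Q01",   Field.Store.YES, Field.Index.TOKENIZED));
    return doc;
}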

 

I receive a new code to add to the document, so I create a copy of the
document, call deleteFields("codesearch") on the copy, and then use the
original's field values plus the new code to build new fields on the copy
(horribly inefficient, I know, but it's inherited code and this needs to be
fixed first). I then delete the old document from the index and add the new
one.
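
In code, the rewrite step looks roughly like the sketch below. The names
writer, original, docId, and codes are placeholders of mine, and I'm assuming
the deleteFields call above corresponds to Document.removeFields:

import java.io.IOException;
import java.util.List;

import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;
import org.apache.lucene.document.Fieldable;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.index.Term;

static void rewriteDocument(IndexWriter writer, Document original,
                            String docId, List codes) throws IOException {
    // Copy the original document's fields onto a fresh Document.
    Document copy = new Document();
    for (Object o : original.getFields()) {
        copy.add((Fieldable) o);
    }

    // Drop the old codes from the copy, then re-add them along with the new one.
    copy.removeFields("codesearch");
    for (Object code : codes) {
        copy.add(new Field("codesearch", (String) code,
                           Field.Store.YES, Field.Index.TOKENIZED));
    }

    // Delete the old document and add the rewritten one.
    // ("docid" is just a placeholder key field for this sketch.)
    writer.deleteDocuments(new Term("docid", docId));
    writer.addDocument(copy);
}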

 

The document was searchable on any of the codes before the rewrite, but after
the rewrite none of the codes is searchable. Luke shows them as present in the
new document in the index. I reopen the reader, and I'm using a
StandardAnalyzer for both indexing and querying.
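
For reference, the search side looks roughly like this (again only a sketch;
countHits and current are my names, and the reader/searcher lifecycle is
simplified):

import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.index.IndexReader;
import org.apache.lucene.queryParser.QueryParser;
import org.apache.lucene.search.Hits;
import org.apache.lucene.search.IndexSearcher;

static int countHits(IndexReader reader, String code) throws Exception {
    // Reopen so the rewritten document is visible; reopen() may return a new reader.
    IndexReader current = reader.reopen();
    IndexSearcher searcher = new IndexSearcher(current);

    // Same StandardAnalyzer at query time as at index time.
    QueryParser parser = new QueryParser("codesearch", new StandardAnalyzer());
    Hits hits = searcher.search(parser.parse(code));
    return hits.length();
}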

 

Is there anything else I have to do? What gives? Is there a problem with
having multiple fields with the same name?

 

John G.