lucene-dev mailing list archives

From DM Smith <>
Subject Re: Token implementation
Date Sat, 12 Jul 2008 12:42:16 GMT
Michael McCandless wrote:
>> But, in TokenFilter, next() should be deprecated, IMHO.
> I think this is a good idea.  After all if people don't want to bother
> using the passed in Token, they are still allowed to return a new
> one.
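For context, here is roughly what that contract looks like in a filter (just a sketch against the 2.x next(Token) API; PassThroughFilter is a made-up name):

import java.io.IOException;
import org.apache.lucene.analysis.Token;
import org.apache.lucene.analysis.TokenFilter;
import org.apache.lucene.analysis.TokenStream;

// Hypothetical pass-through filter, only to illustrate the reuse contract.
public class PassThroughFilter extends TokenFilter {
  public PassThroughFilter(TokenStream input) {
    super(input);
  }

  public Token next(Token reusableToken) throws IOException {
    // The consumer offers reusableToken for reuse; a filter may fill it in
    // and return it, or ignore it and return a freshly allocated Token.
    return input.next(reusableToken);
  }
}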

I'm looking into the deprecation of next() and have come
across a pattern that wants to know whether there are future tokens:
Token t1 = input.next();
Token t2 = input.next();
if (t2 != null) {
  // there are multiple tokens
  // ... set up for the multiple-token scenario ...
  // ... consume t1, t2 and then loop ...
} else {
  // there is only one token
  // ... set up for the single-token scenario ...
  // ... consume t1 ...
}

I'm wondering about the best way to handle this. An obvious solution
comes to mind: use clone():
Token token = new Token();
Token t1 = input.next(token);
if (t1 != null) {
  t1 = (Token) t1.clone();
}

Token t2 = input.next(token);
... same as before, but calling input.next(token) ...
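Spelled out inside a filter, the clone() approach would look something like this (a rough sketch only; LookAheadFilter is a made-up name, and the buffered token is simply handed back on the following call):

import java.io.IOException;
import org.apache.lucene.analysis.Token;
import org.apache.lucene.analysis.TokenFilter;
import org.apache.lucene.analysis.TokenStream;

public class LookAheadFilter extends TokenFilter {
  private Token pending;        // token already read from input but not yet returned
  private boolean initialized;

  public LookAheadFilter(TokenStream input) {
    super(input);
  }

  public Token next(Token reusableToken) throws IOException {
    if (!initialized) {
      initialized = true;
      Token t1 = input.next(reusableToken);
      if (t1 == null) {
        return null;
      }
      // Clone before reusing reusableToken for the look-ahead call.
      t1 = (Token) t1.clone();
      Token t2 = input.next(reusableToken);
      if (t2 != null) {
        // multiple tokens: set up for that scenario and keep t2 for later
        pending = (Token) t2.clone();
      } else {
        // single token: set up for that scenario
      }
      return t1;
    }
    if (pending != null) {
      Token t = pending;
      pending = null;
      return t;
    }
    return input.next(reusableToken);
  }
}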

Another option is to use tokenStream.reset(), but I am not sure that all
TokenStreams implement it correctly or that it is performant.
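For completeness, the reset() route would be along these lines (sketch only; it is only safe where the wrapped stream actually implements reset(), which is an optional operation):

// Peek ahead by consuming, then rewind so the tokens can be consumed again.
Token t1 = input.next(new Token());
Token t2 = input.next(new Token());
boolean multiple = (t2 != null);
input.reset();   // optional operation; not every TokenStream supports it
// ... then set up for the single- or multiple-token scenario and re-consume ...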

All the other solutions I can think of cause more grief to existing code.

-- DM

