lucene-java-user mailing list archives

From Daniel Naber <daniel.na...@t-online.de>
Subject Re: Why does the StandardTokenizer split hyphenated words?
Date Thu, 16 Dec 2004 19:03:27 GMT
On Thursday 16 December 2004 13:46, Mike Snare wrote:

> > Maybe for "a-b", but what about English words like "half-baked"?
>
> Perhaps that's the difference in thinking, then.  I would imagine that
> you would want to search on "half-baked" and not "half AND baked".

A search for half-baked will find both half-baked and "half baked" (the 
phrase). The only thing you'll not find is halfbaked.
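To see why, here is a minimal self-contained sketch (not the actual Lucene StandardTokenizer, just an approximation of how it treats plain ASCII hyphenated words): the hyphen is a token break, so "half-baked" and "half baked" index the same terms, while "halfbaked" stays one term.

```java
import java.util.Arrays;
import java.util.List;

public class HyphenTokenDemo {
    // Rough approximation of StandardTokenizer's handling of plain ASCII
    // hyphenated words: split on any non-alphanumeric run, lowercase.
    // (The real tokenizer has extra rules, e.g. for tokens with digits
    // like product codes "AB-1234"; this sketch ignores those cases.)
    static List<String> tokenize(String text) {
        return Arrays.asList(text.toLowerCase().split("[^a-z0-9]+"));
    }

    public static void main(String[] args) {
        System.out.println(tokenize("half-baked")); // [half, baked]
        System.out.println(tokenize("half baked")); // [half, baked]
        // Same terms either way, so a query for half-baked (parsed as
        // the phrase "half baked") matches both forms in the index.
        System.out.println(tokenize("halfbaked"));  // [halfbaked]
    }
}
```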

Regards
 Daniel

-- 
http://www.danielnaber.de

---------------------------------------------------------------------
To unsubscribe, e-mail: lucene-user-unsubscribe@jakarta.apache.org
For additional commands, e-mail: lucene-user-help@jakarta.apache.org

