lucene-java-user mailing list archives

From Daniel Naber <>
Subject Re: Why does the StandardTokenizer split hyphenated words?
Date Thu, 16 Dec 2004 19:03:27 GMT
On Thursday 16 December 2004 13:46, Mike Snare wrote:

> > Maybe for "a-b", but what about English words like "half-baked"?
> Perhaps that's the difference in thinking, then.  I would imagine that
> you would want to search on "half-baked" and not "half AND baked".

A search for half-baked will find both half-baked and "half baked" (the
phrase). The only thing you'll not find is halfbaked.
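The behavior described above can be sketched with a toy tokenizer and phrase matcher (a simplified Python illustration, not Lucene's actual StandardTokenizer, which has more rules than splitting on non-word characters): because "half-baked" is indexed as the two tokens "half" and "baked", a query for "half-baked" is tokenized the same way and matches as a phrase, while "halfbaked" stays one token and does not match.

```python
import re

def tokenize(text):
    # Hypothetical simplification of StandardTokenizer's splitting:
    # break on non-word characters, so "half-baked" -> ["half", "baked"].
    return re.findall(r"\w+", text.lower())

def phrase_match(doc, query):
    # A phrase query matches when the query's token sequence appears
    # contiguously in the document's token stream.
    d, q = tokenize(doc), tokenize(query)
    return any(d[i:i + len(q)] == q for i in range(len(d) - len(q) + 1))

print(phrase_match("a half-baked idea", "half-baked"))  # True
print(phrase_match("a half baked idea", "half-baked"))  # True
print(phrase_match("a halfbaked idea", "half-baked"))   # False
```

Under this tokenization, hyphenated and space-separated spellings index identically, which is why the two queries behave the same.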
