lucene-solr-dev mailing list archives

From "Steven Rowe (JIRA)" <>
Subject [jira] Updated: (SOLR-1670) synonymfilter/map repeat bug
Date Sun, 31 Jan 2010 08:07:34 GMT


Steven Rowe updated SOLR-1670:

    Attachment: SOLR-1670.patch

New version of the patch, making it possible to test token streams that are allowed
to vary the order of their overlapping tokens.  All tests pass for me.
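To make the idea concrete: one way a test can accept either ordering of overlapping tokens is to group tokens by absolute position (derived from position increments) and compare the groups as sets. This is only an illustrative sketch, not the attached patch; the class and method names (`OverlapCheck`, `equalIgnoringOverlapOrder`, `groupByPosition`) are hypothetical.

```java
import java.util.*;

public class OverlapCheck {
    // Hypothetical sketch: compare two token streams (given as parallel lists of
    // terms and position increments) while ignoring the emission order of tokens
    // that overlap at the same position.
    static boolean equalIgnoringOverlapOrder(List<String> terms1, List<Integer> posIncs1,
                                             List<String> terms2, List<Integer> posIncs2) {
        return groupByPosition(terms1, posIncs1).equals(groupByPosition(terms2, posIncs2));
    }

    // Group terms by absolute position; a position increment of 0 means
    // "same position as the previous token", i.e. an overlapping token.
    static Map<Integer, Set<String>> groupByPosition(List<String> terms, List<Integer> posIncs) {
        Map<Integer, Set<String>> groups = new TreeMap<>();
        int pos = -1;
        for (int i = 0; i < terms.size(); i++) {
            pos += posIncs.get(i);
            groups.computeIfAbsent(pos, k -> new TreeSet<>()).add(terms.get(i));
        }
        return groups;
    }

    public static void main(String[] args) {
        // "aa" and "a" overlap at position 0; emission order should not matter.
        System.out.println(equalIgnoringOverlapOrder(
            Arrays.asList("aa", "a"), Arrays.asList(1, 0),
            Arrays.asList("a", "aa"), Arrays.asList(1, 0))); // true
    }
}
```

Under this scheme the aa/a versus a/aa difference Robert mentions below would be accepted, while a stream that dropped or duplicated a token would still fail.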

Yonik wrote:
bq. The description of the synonym filter never specifies the order of overlapping tokens,
thus the tests should accept either.

Yonik, can you check that the attached patch does what you are suggesting?

Robert wrote:
I guess in my opinion, overall it's better for the tests to be overly strict; if in the
future we make a valid change to the implementation that breaks a test, we can discuss it
during said change, and people can comment on whether the behavior was actually important.
For example, the aa/a versus a/aa case I would probably say is not a big deal, but the aa
versus aa/aa/aa thing to me is a big deal.

The alternative, being overly lax, presents the possibility of introducing incorrect behavior
without being caught, and I think this is very dangerous.

Robert, do you think that the attached patch crosses over into overly lax land?

> synonymfilter/map repeat bug
> ----------------------------
>                 Key: SOLR-1670
>                 URL:
>             Project: Solr
>          Issue Type: Bug
>          Components: Schema and Analysis
>    Affects Versions: 1.4
>            Reporter: Robert Muir
>            Assignee: Yonik Seeley
>         Attachments: SOLR-1670.patch, SOLR-1670.patch, SOLR-1670_test.patch
> as part of converting tests for SOLR-1657, I ran into a problem with synonymfilter
> the test for 'repeats' has a flaw: it uses this assertTokEqual construct, which does not
really validate that two lists of tokens are equal; it just stops at the shorter one.
> {code}
>     // repeats
>     map.add(strings("a b"), tokens("ab"), orig, merge);
>     map.add(strings("a b"), tokens("ab"), orig, merge);
>     assertTokEqual(getTokList(map,"a b",false), tokens("ab"));
>     /* in reality the result from getTokList is ab ab ab!!!!! */
> {code}
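The flaw Robert describes can be shown in isolation: a comparison that only walks the shorter of the two lists cannot see trailing extra tokens, while a strict length-checking comparison catches them. The sketch below is illustrative only; `TokenListCheck`, `laxEquals`, and `strictEquals` are hypothetical names, not actual Solr test code.

```java
import java.util.Arrays;
import java.util.List;

public class TokenListCheck {
    // Lax check in the spirit of the old assertTokEqual: compares elements only
    // up to the shorter list, so extra trailing tokens go unnoticed.
    static boolean laxEquals(List<String> expected, List<String> actual) {
        int n = Math.min(expected.size(), actual.size());
        for (int i = 0; i < n; i++) {
            if (!expected.get(i).equals(actual.get(i))) return false;
        }
        return true; // tokens beyond index n are silently ignored
    }

    // Strict check in the spirit of assertTokenStreamContents: the lengths must
    // match as well, so a stream that repeats tokens fails the test.
    static boolean strictEquals(List<String> expected, List<String> actual) {
        return expected.equals(actual);
    }

    public static void main(String[] args) {
        List<String> expected = Arrays.asList("ab");
        List<String> actual = Arrays.asList("ab", "ab", "ab"); // the repeat bug's output
        System.out.println(laxEquals(expected, actual));    // true  -> bug hidden
        System.out.println(strictEquals(expected, actual)); // false -> bug caught
    }
}
```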
> when converted to assertTokenStreamContents, this problem surfaced. attached is an additional
assertion for the existing testcase.

This message is automatically generated by JIRA.
You can reply to this email to add a comment to the issue online.
