lucy-dev mailing list archives

From Marvin Humphrey <>
Subject Re: [lucy-dev] Optimization Bug On Mac OS X
Date Wed, 21 Mar 2012 23:15:34 GMT
On Mon, Mar 19, 2012 at 10:43 PM, Logan Bell <> wrote:
> To summarize, I was getting the following unit test failure after
> compiling Lucy with Perl 5.14.2 (installed via perlbrew):

The fact that this occurred only under perlbrew is notable: the system Perl on
Lion appears to have been compiled with GCC, but the default compiler "cc" has
since been switched to Clang, so perlbrew-built Perls are compiled with Clang.
We appear to have encountered an optimization bug in Clang.

> not ok 2 - catch overflow in token position calculation
> #   Failed test 'catch overflow in token position calculation'
> #   at t/152-inversion.t line 70.
> #                   ''
> #     doesn't match '(?^:position)'

The test is designed to cover this section of code:

    // Assign token positions.
    for (; tokens < limit; tokens++) {
        Token *const cur_token = *tokens;
        cur_token->pos = token_pos;
        token_pos += cur_token->pos_inc;
        if (token_pos < cur_token->pos) {
            THROW(ERR, "Token positions out of order: %i32 %i32",
                  cur_token->pos, token_pos);
        }
    }

The test tries to trigger that exception, but fails to do so under this
particular Perl.

> After dropping in various debug statements the unit test would magically
> begin to pass.

If we insert the following debugging WARN statement immediately *above* the
"if" clause, things start to work as intended.

    WARN("token_pos: %i32 cur_token->pos: %i32", token_pos, cur_token->pos);

If we insert the WARN statement directly *below* the "if" test, the problem
continues to manifest, but the WARN statement reveals values for the
variables which clearly should have triggered the exception.

> At this point I turned off optimization and the test would
> always pass. Based on this evidence it would be safe to surmise that the
> optimizer is making a mistake somewhere. After further discussions on IRC
> with Marvin, we speculated that 32 bit integers that are being accumulated
> into 64 bit registers are *not* being truncated back to 32 bit prior to a
> comparison operation.

It is the behavior when the WARN statement is inserted which spurs this
speculation.

Without the WARN statement, the last thing which happens before the comparison
operation is the addition to token_pos.  The compiler can leave that value in
a register and then compare against it.

When the WARN statement is inserted, the register where token_pos had been
accumulated will be needed for other things, so the value of token_pos must be
moved out of the register to temporary storage, then restored so that the
comparison can take place.

The hypothesis is:

  * The sum is being accumulated into a register that is larger than 32 bits.
  * An extra op is necessary to determine that 32-bit overflow has occurred.
  * This op has been inappropriately optimized away by the compiler.

Moving the value out of the register and then restoring it has the side
effect of truncating it.  For what it's worth, we worked around an "excess
floating point precision" problem using a similar brute-force truncation:

                    if (*(int32_t*)&score == *(int32_t*)&self->bubble_score) {

> One idea that was floated on IRC was to convert the associated integers in
> question to int64_t, since support for 64 bit integers is required anyways
> for Lucy.

If we go 64-bit, we still need to guard against overflow.  The only advantage
is that it might work around this apparent optimizer bug without us needing to
do stupid stuff like probe for the bug and #ifdef the problematic section.

Marvin Humphrey
