From: Dawid Weiss <dawid.weiss@gmail.com>
To: dev@lucene.apache.org
Date: Tue, 10 Apr 2012 21:19:20 +0200
Subject: Re: svn commit: r1311920 - /lucene/dev/branches/lucene3969/modules/analysis/common/src/test/org/apache/lucene/analysis/core/TestRandomChains.java

That's because emacs (and vim) indent blocks of code you haven't thought
about yet, but will think about five minutes from now...

Dawid

On Tue, Apr 10, 2012 at 9:07 PM, Uwe Schindler <uwe@thetaphi.de> wrote:
> No problem,
>
> I mainly re-added the missing newlines between methods.
>
> The other indenting was not so important, but it took too much space to the
> right. Why does Emacs change the indenting of unrelated code? My favorite
> Notepad++ (or Eclipse, if I am also refactoring) only does this on the block
> you are working on! It seems your Emacs sometimes reformats the whole file?
>
> Uwe
>
> -----
> Uwe Schindler
> H.-H.-Meier-Allee 63, D-28213 Bremen
> http://www.thetaphi.de
> eMail: uwe@thetaphi.de
>
>> -----Original Message-----
>> From: Michael McCandless [mailto:lucene@mikemccandless.com]
>> Sent: Tuesday, April 10, 2012 9:01 PM
>> To: dev@lucene.apache.org
>> Subject: Re: svn commit: r1311920 - /lucene/dev/branches/lucene3969/modules/analysis/common/src/test/org/apache/lucene/analysis/core/TestRandomChains.java
>>
>> Sorry Uwe :)
>>
>> I guess Emacs indents differently from Eclipse!
>>
>> Mike McCandless
>>
>> http://blog.mikemccandless.com
>>
>> On Tue, Apr 10, 2012 at 2:50 PM,  wrote:
>> > Author: uschindler
>> > Date: Tue Apr 10 18:50:54 2012
>> > New Revision: 1311920
>> >
>> > URL: http://svn.apache.org/viewvc?rev=1311920&view=rev
>> > Log:
>> > LUCENE-3969: revert Whitespace
>> >
>> > Modified:
>> >     lucene/dev/branches/lucene3969/modules/analysis/common/src/test/org/apache/lucene/analysis/core/TestRandomChains.java
>> >
>> > Modified: lucene/dev/branches/lucene3969/modules/analysis/common/src/test/org/apache/lucene/analysis/core/TestRandomChains.java
>> > URL: http://svn.apache.org/viewvc/lucene/dev/branches/lucene3969/modules/analysis/common/src/test/org/apache/lucene/analysis/core/TestRandomChains.java?rev=1311920&r1=1311919&r2=1311920&view=diff
>> > ==============================================================================
>> > --- lucene/dev/branches/lucene3969/modules/analysis/common/src/test/org/apache/lucene/analysis/core/TestRandomChains.java (original)
>> > +++ lucene/dev/branches/lucene3969/modules/analysis/common/src/test/org/apache/lucene/analysis/core/TestRandomChains.java Tue Apr 10 18:50:54 2012
>> > @@ -105,30 +105,30 @@ public class TestRandomChains extends Ba
>> >     // nocommit can we promote some of these to be only
>> >     // offsets offenders?
>> >     Collections.<Class<?>>addAll(brokenComponents,
>> > -                      // TODO: fix basetokenstreamtestcase not to trip because this one has no CharTermAtt
>> > -                      EmptyTokenizer.class,
>> > -                      // doesn't actual reset itself!
>> > -                      CachingTokenFilter.class,
>> > -                      // doesn't consume whole stream!
>> > -                      LimitTokenCountFilter.class,
>> > -                      // Not broken: we forcefully add this, so we shouldn't
>> > -                      // also randomly pick it:
>> > -                      ValidatingTokenFilter.class,
>> > -                      // NOTE: these by themselves won't cause any 'basic assertions' to fail.
>> > -                      // but see https://issues.apache.org/jira/browse/LUCENE-3920, if any
>> > -                      // tokenfilter that combines words (e.g. shingles) comes after them,
>> > -                      // this will create bogus offsets because their 'offsets go backwards',
>> > -                      // causing shingle or whatever to make a single token with a
>> > -                      // startOffset thats > its endOffset
>> > -                      // (see LUCENE-3738 for a list of other offenders here)
>> > -                      // broken!
>> > -                      NGramTokenizer.class,
>> > -                      // broken!
>> > -                      NGramTokenFilter.class,
>> > -                      // broken!
>> > -                      EdgeNGramTokenizer.class,
>> > -                      // broken!
>> > -                      EdgeNGramTokenFilter.class
>> > +      // TODO: fix basetokenstreamtestcase not to trip because this one has no CharTermAtt
>> > +      EmptyTokenizer.class,
>> > +      // doesn't actual reset itself!
>> > +      CachingTokenFilter.class,
>> > +      // doesn't consume whole stream!
>> > +      LimitTokenCountFilter.class,
>> > +      // Not broken: we forcefully add this, so we shouldn't
>> > +      // also randomly pick it:
>> > +      ValidatingTokenFilter.class,
>> > +      // NOTE: these by themselves won't cause any 'basic assertions' to fail.
>> > +      // but see https://issues.apache.org/jira/browse/LUCENE-3920, if any
>> > +      // tokenfilter that combines words (e.g. shingles) comes after them,
>> > +      // this will create bogus offsets because their 'offsets go backwards',
>> > +      // causing shingle or whatever to make a single token with a
>> > +      // startOffset thats > its endOffset
>> > +      // (see LUCENE-3738 for a list of other offenders here)
>> > +      // broken!
>> > +      NGramTokenizer.class,
>> > +      // broken!
>> > +      NGramTokenFilter.class,
>> > +      // broken!
>> > +      EdgeNGramTokenizer.class,
>> > +      // broken!
>> > +      EdgeNGramTokenFilter.class
>> >     );
>> >   }
>> >
>> > @@ -137,18 +137,19 @@ public class TestRandomChains extends Ba
>> >   private static final Set<Class<?>> brokenOffsetsComponents = Collections.newSetFromMap(new IdentityHashMap<Class<?>,Boolean>());
>> >   static {
>> >     Collections.<Class<?>>addAll(brokenOffsetsComponents,
>> > -                      WordDelimiterFilter.class,
>> > -                      TrimFilter.class,
>> > -                      ReversePathHierarchyTokenizer.class,
>> > -                      PathHierarchyTokenizer.class,
>> > -                      HyphenationCompoundWordTokenFilter.class,
>> > -                      DictionaryCompoundWordTokenFilter.class,
>> > -                      // nocommit: corrumpts graphs (offset consistency check):
>> > -                      PositionFilter.class,
>> > -                      // nocommit it seems to mess up offsets!?
>> > -                      WikipediaTokenizer.class
>> > -                      );
>> > +      WordDelimiterFilter.class,
>> > +      TrimFilter.class,
>> > +      ReversePathHierarchyTokenizer.class,
>> > +      PathHierarchyTokenizer.class,
>> > +      HyphenationCompoundWordTokenFilter.class,
>> > +      DictionaryCompoundWordTokenFilter.class,
>> > +      // nocommit: corrumpts graphs (offset consistency check):
>> > +      PositionFilter.class,
>> > +      // nocommit it seems to mess up offsets!?
>> > +      WikipediaTokenizer.class
>> > +    );
>> >   }
>> > +
>> >   @BeforeClass
>> >   public static void beforeClass() throws Exception {
>> >     List<Class<?>> analysisClasses = new ArrayList<Class<?>>();
>> > @@ -168,6 +169,7 @@ public class TestRandomChains extends Ba
>> >       ) {
>> >         continue;
>> >       }
>> > +
>> >       for (final Constructor<?> ctor : c.getConstructors()) {
>> >         // don't test synthetic or deprecated ctors, they likely have known bugs:
>> >         if (ctor.isSynthetic() || ctor.isAnnotationPresent(Deprecated.class)) {
>> > @@ -175,21 +177,22 @@ public class TestRandomChains extends Ba
>> >         }
>> >         if (Tokenizer.class.isAssignableFrom(c)) {
>> >           assertTrue(ctor.toGenericString() + " has unsupported parameter types",
>> > -                     allowedTokenizerArgs.containsAll(Arrays.asList(ctor.getParameterTypes())));
>> > +            allowedTokenizerArgs.containsAll(Arrays.asList(ctor.getParameterTypes())));
>> >           tokenizers.add(castConstructor(Tokenizer.class, ctor));
>> >         } else if (TokenFilter.class.isAssignableFrom(c)) {
>> >           assertTrue(ctor.toGenericString() + " has unsupported parameter types",
>> > -                     allowedTokenFilterArgs.containsAll(Arrays.asList(ctor.getParameterTypes())));
>> > +            allowedTokenFilterArgs.containsAll(Arrays.asList(ctor.getParameterTypes())));
>> >           tokenfilters.add(castConstructor(TokenFilter.class, ctor));
>> >         } else if (CharStream.class.isAssignableFrom(c)) {
>> >           assertTrue(ctor.toGenericString() + " has unsupported parameter types",
>> > -                     allowedCharFilterArgs.containsAll(Arrays.asList(ctor.getParameterTypes())));
>> > +            allowedCharFilterArgs.containsAll(Arrays.asList(ctor.getParameterTypes())));
>> >           charfilters.add(castConstructor(CharStream.class, ctor));
>> >         } else {
>> >           fail("Cannot get here");
>> >         }
>> >       }
>> >     }
>> > +
>> >     final Comparator<Constructor<?>> ctorComp = new Comparator<Constructor<?>>() {
>> >       @Override
>> >       public int compare(Constructor<?> arg0, Constructor<?> arg1) {
>> > @@ -205,12 +208,14 @@ public class TestRandomChains extends Ba
>> >       System.out.println("charfilters = " + charfilters);
>> >     }
>> >   }
>> > +
>> >   @AfterClass
>> >   public static void afterClass() throws Exception {
>> >     tokenizers = null;
>> >     tokenfilters = null;
>> >     charfilters = null;
>> >   }
>> > +
>> >   /** Hack to work around the stupidness of Oracle's strict Java backwards compatibility.
>> >    * {@code Class#getConstructors()} should return unmodifiable {@code List<Constructor<?>>} not array! */
>> >   @SuppressWarnings("unchecked")
>> >
>> >
>>
>

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@lucene.apache.org
For additional commands, e-mail: dev-help@lucene.apache.org
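[Aside for readers skimming the quoted diff: the test tracks known-broken analysis
components in an identity-based Set<Class<?>> built with Collections.newSetFromMap over an
IdentityHashMap, fills it via Collections.<Class<?>>addAll, and walks public constructors
reflectively while skipping synthetic or deprecated ones. The sketch below shows those
idioms in isolation; the class name BrokenComponentsDemo, the placeholder set entries, and
the exact shape of the castConstructor helper are illustrative assumptions, not the actual
Lucene test code.]

import java.lang.reflect.Constructor;
import java.util.Collections;
import java.util.IdentityHashMap;
import java.util.Set;

// Illustrative sketch only -- names here are stand-ins, not the Lucene test itself.
public class BrokenComponentsDemo {

  // Identity-based Set<Class<?>>: Class tokens are canonical per class loader,
  // so an IdentityHashMap-backed set is a cheap, safe way to track them.
  private static final Set<Class<?>> brokenComponents =
      Collections.newSetFromMap(new IdentityHashMap<Class<?>, Boolean>());

  static {
    // Varargs addAll with an explicit type witness, as in the quoted diff.
    Collections.<Class<?>>addAll(brokenComponents,
        String.class,        // placeholder entries; the real test lists analysis
        StringBuilder.class  // components such as NGramTokenizer.class
    );
  }

  // Unchecked-cast helper in the spirit of the castConstructor hack the quoted
  // javadoc mentions (the real method's exact shape may differ). The Class<T>
  // parameter is only a type witness; the cast is safe when ctor really
  // constructs instances of T.
  @SuppressWarnings("unchecked")
  private static <T> Constructor<T> castConstructor(Class<T> clazz, Constructor<?> ctor) {
    return (Constructor<T>) ctor;
  }

  public static void main(String[] args) {
    for (Constructor<?> ctor : StringBuilder.class.getConstructors()) {
      // Skip synthetic or deprecated constructors, mirroring the quoted check.
      if (ctor.isSynthetic() || ctor.isAnnotationPresent(Deprecated.class)) {
        continue;
      }
      System.out.println("usable ctor: " + ctor.toGenericString());
    }
    System.out.println("StringBuilder broken? " + brokenComponents.contains(StringBuilder.class));

    // getConstructors() returns Constructor<?>[], hence the helper to regain the type.
    Constructor<String> strCtor = castConstructor(String.class, String.class.getConstructors()[0]);
    System.out.println("first public String ctor: " + strCtor);
  }
}

Compiling and running this (javac BrokenComponentsDemo.java && java BrokenComponentsDemo)
prints StringBuilder's usable constructors and the set membership; identity semantics are
enough here precisely because each Class object exists once per class loader.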