From: Robert Muir <rcmuir@gmail.com>
Date: Sat, 10 Jan 2015 15:31:29 -0500
Subject: Re: [JENKINS] Lucene-Solr-NightlyTests-5.x - Build # 730 - Still Failing
To: dev@lucene.apache.org

I committed a fix.
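For anyone chasing a similar HandleLimitFS "Too many open files" failure: the mock filesystem just caps the number of handles the test may hold open, so the usual culprit is a Directory, IndexWriter or reader that gets opened on some path and never closed. The shape to aim for is roughly the following sketch against the plain Lucene API (the class name and temp path are made up for illustration, and this is not necessarily what the actual change does):

    import java.nio.file.Files;
    import java.nio.file.Path;

    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.document.Document;
    import org.apache.lucene.document.Field;
    import org.apache.lucene.document.StringField;
    import org.apache.lucene.index.DirectoryReader;
    import org.apache.lucene.index.IndexWriter;
    import org.apache.lucene.index.IndexWriterConfig;
    import org.apache.lucene.store.Directory;
    import org.apache.lucene.store.FSDirectory;

    public class CloseHandlesSketch {
      public static void main(String[] args) throws Exception {
        // throwaway index location, just for the sketch
        Path indexPath = Files.createTempDirectory("handle-sketch");
        // try-with-resources closes the Directory and IndexWriter even when
        // indexing throws, so no file handles leak across iterations
        try (Directory dir = FSDirectory.open(indexPath);
             IndexWriter writer = new IndexWriter(dir,
                 new IndexWriterConfig(new StandardAnalyzer()))) {
          Document doc = new Document();
          doc.add(new StringField("id", "1", Field.Store.YES));
          writer.addDocument(doc);
          writer.commit();
          // readers hold handles too: scope them the same way
          try (DirectoryReader reader = DirectoryReader.open(dir)) {
            System.out.println("docs: " + reader.numDocs());
          }
        }
      }
    }

In the test the directories are wrapped in MockDirectoryWrapper and the cap comes from HandleLimitFS, but the principle is the same: every path that opens a writer, reader or directory needs a matching close, including the exception paths.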
On Sat, Jan 10, 2015 at 9:44 AM, Apache Jenkins Server wrote:
> Build: https://builds.apache.org/job/Lucene-Solr-NightlyTests-5.x/730/
>
> 1 tests failed.
>
> REGRESSION: org.apache.lucene.index.TestDemoParallelLeafReader.testRandomMultipleSchemaGensSameField
>
> Error Message:
> /usr/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.x/lucene/build/core/test/J3/temp/lucene.index.TestDemoParallelLeafReader 78910BB799798171-001/tempDir-002/index/_az_TestBloomFilteredLucenePostings_0.tim: Too many open files
>
> Stack Trace:
> java.nio.file.FileSystemException: /usr/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.x/lucene/build/core/test/J3/temp/lucene.index.TestDemoParallelLeafReader 78910BB799798171-001/tempDir-002/index/_az_TestBloomFilteredLucenePostings_0.tim: Too many open files
>         at org.apache.lucene.mockfile.HandleLimitFS.onOpen(HandleLimitFS.java:49)
>         at org.apache.lucene.mockfile.HandleTrackingFS.callOpenHook(HandleTrackingFS.java:84)
>         at org.apache.lucene.mockfile.HandleTrackingFS.newOutputStream(HandleTrackingFS.java:157)
>         at java.nio.file.Files.newOutputStream(Files.java:172)
>         at org.apache.lucene.store.FSDirectory$FSIndexOutput.<init>(FSDirectory.java:265)
>         at org.apache.lucene.store.FSDirectory.createOutput(FSDirectory.java:214)
>         at org.apache.lucene.store.MockDirectoryWrapper.createOutput(MockDirectoryWrapper.java:607)
>         at org.apache.lucene.store.TrackingDirectoryWrapper.createOutput(TrackingDirectoryWrapper.java:43)
>         at org.apache.lucene.codecs.blocktree.BlockTreeTermsWriter.<init>(BlockTreeTermsWriter.java:278)
>         at org.apache.lucene.codecs.lucene50.Lucene50PostingsFormat.fieldsConsumer(Lucene50PostingsFormat.java:433)
>         at org.apache.lucene.codecs.bloom.BloomFilteringPostingsFormat.fieldsConsumer(BloomFilteringPostingsFormat.java:147)
>         at org.apache.lucene.codecs.bloom.TestBloomFilteredLucenePostings.fieldsConsumer(TestBloomFilteredLucenePostings.java:66)
>         at org.apache.lucene.codecs.perfield.PerFieldPostingsFormat$FieldsWriter.write(PerFieldPostingsFormat.java:196)
>         at org.apache.lucene.index.FreqProxTermsWriter.flush(FreqProxTermsWriter.java:107)
>         at org.apache.lucene.index.DefaultIndexingChain.flush(DefaultIndexingChain.java:112)
>         at org.apache.lucene.index.DocumentsWriterPerThread.flush(DocumentsWriterPerThread.java:419)
>         at org.apache.lucene.index.DocumentsWriter.doFlush(DocumentsWriter.java:503)
>         at org.apache.lucene.index.DocumentsWriter.postUpdate(DocumentsWriter.java:373)
>         at org.apache.lucene.index.DocumentsWriter.updateDocument(DocumentsWriter.java:466)
>         at org.apache.lucene.index.IndexWriter.updateDocument(IndexWriter.java:1415)
>         at org.apache.lucene.index.IndexWriter.addDocument(IndexWriter.java:1150)
>         at org.apache.lucene.index.IndexWriter.addDocument(IndexWriter.java:1135)
>         at org.apache.lucene.index.TestDemoParallelLeafReader.testRandomMultipleSchemaGensSameField(TestDemoParallelLeafReader.java:1076)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1618)
>         at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:827)
>         at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:863)
>         at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:877)
>         at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:50)
>         at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
>         at com.carrotsearch.randomizedtesting.rules.SystemPropertiesInvariantRule$1.evaluate(SystemPropertiesInvariantRule.java:55)
>         at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:49)
>         at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
>         at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
>         at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
>         at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
>         at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:798)
>         at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:458)
>         at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:836)
>         at com.carrotsearch.randomizedtesting.RandomizedRunner$3.evaluate(RandomizedRunner.java:738)
>         at com.carrotsearch.randomizedtesting.RandomizedRunner$4.evaluate(RandomizedRunner.java:772)
>         at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:783)
>         at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:46)
>         at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:42)
>         at com.carrotsearch.randomizedtesting.rules.SystemPropertiesInvariantRule$1.evaluate(SystemPropertiesInvariantRule.java:55)
>         at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
>         at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:39)
>         at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
>         at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
>         at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
>         at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:54)
>         at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:48)
>         at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:65)
>         at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:55)
>         at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
>         at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:365)
>         at java.lang.Thread.run(Thread.java:745)
>
> Build Log:
> [...truncated 1862 lines...]
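The master seed is right there in the traces ([78910BB799798171]), so locally this should be re-runnable from lucene/core with something along the lines of "ant test -Dtestcase=TestDemoParallelLeafReader -Dtests.method=testRandomMultipleSchemaGensSameField -Dtests.seed=78910BB799798171 -Dtests.nightly=true" (exact property names as in the branch's common-build.xml); since the handle limit is randomized it may not trip on every run.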
>   [junit4] Suite: org.apache.lucene.index.TestDemoParallelLeafReader
>   [junit4]   2> jan 10, 2015 7:12:15 PM com.carrotsearch.randomizedtesting.RandomizedRunner$QueueUncaughtExceptionsHandler uncaughtException
>   [junit4]   2> WARNING: Uncaught exception in thread: Thread[Lucene Merge Thread #1,5,TGRP-TestDemoParallelLeafReader]
>   [junit4]   2> org.apache.lucene.index.MergePolicy$MergeException: java.nio.file.FileSystemException: /usr/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.x/lucene/build/core/test/J3/temp/lucene.index.TestDemoParallelLeafReader 78910BB799798171-001/tempDir-002/segs/6za9dzr2yvnqeaarom3jvpyvp_40/_0_Memory_0.mdvd: Too many open files
>   [junit4]   2>        at __randomizedtesting.SeedInfo.seed([78910BB799798171]:0)
>   [junit4]   2>        at org.apache.lucene.index.ConcurrentMergeScheduler.handleMergeException(ConcurrentMergeScheduler.java:611)
>   [junit4]   2>        at org.apache.lucene.index.ConcurrentMergeScheduler$MergeThread.run(ConcurrentMergeScheduler.java:581)
>   [junit4]   2> Caused by: java.nio.file.FileSystemException: /usr/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.x/lucene/build/core/test/J3/temp/lucene.index.TestDemoParallelLeafReader 78910BB799798171-001/tempDir-002/segs/6za9dzr2yvnqeaarom3jvpyvp_40/_0_Memory_0.mdvd: Too many open files
>   [junit4]   2>        at org.apache.lucene.mockfile.HandleLimitFS.onOpen(HandleLimitFS.java:49)
>   [junit4]   2>        at org.apache.lucene.mockfile.HandleTrackingFS.callOpenHook(HandleTrackingFS.java:84)
>   [junit4]   2>        at org.apache.lucene.mockfile.HandleTrackingFS.newOutputStream(HandleTrackingFS.java:157)
>   [junit4]   2>        at java.nio.file.Files.newOutputStream(Files.java:172)
>   [junit4]   2>        at org.apache.lucene.store.FSDirectory$FSIndexOutput.<init>(FSDirectory.java:265)
>   [junit4]   2>        at org.apache.lucene.store.FSDirectory.createOutput(FSDirectory.java:214)
>   [junit4]   2>        at org.apache.lucene.store.MockDirectoryWrapper.createOutput(MockDirectoryWrapper.java:607)
>   [junit4]   2>        at org.apache.lucene.store.TrackingDirectoryWrapper.createOutput(TrackingDirectoryWrapper.java:43)
>   [junit4]   2>        at org.apache.lucene.codecs.memory.MemoryDocValuesConsumer.<init>(MemoryDocValuesConsumer.java:76)
>   [junit4]   2>        at org.apache.lucene.codecs.memory.MemoryDocValuesFormat.fieldsConsumer(MemoryDocValuesFormat.java:60)
>   [junit4]   2>        at org.apache.lucene.codecs.perfield.PerFieldDocValuesFormat$FieldsWriter.getInstance(PerFieldDocValuesFormat.java:187)
>   [junit4]   2>        at org.apache.lucene.codecs.perfield.PerFieldDocValuesFormat$FieldsWriter.addNumericField(PerFieldDocValuesFormat.java:111)
>   [junit4]   2>        at org.apache.lucene.index.NumericDocValuesWriter.flush(NumericDocValuesWriter.java:90)
>   [junit4]   2>        at org.apache.lucene.index.DefaultIndexingChain.writeDocValues(DefaultIndexingChain.java:142)
>   [junit4]   2>        at org.apache.lucene.index.DefaultIndexingChain.flush(DefaultIndexingChain.java:93)
>   [junit4]   2>        at org.apache.lucene.index.DocumentsWriterPerThread.flush(DocumentsWriterPerThread.java:419)
>   [junit4]   2>        at org.apache.lucene.index.DocumentsWriter.doFlush(DocumentsWriter.java:503)
>   [junit4]   2>        at org.apache.lucene.index.DocumentsWriter.flushAllThreads(DocumentsWriter.java:615)
>   [junit4]   2>        at org.apache.lucene.index.IndexWriter.doFlush(IndexWriter.java:3019)
>   [junit4]   2>        at org.apache.lucene.index.IndexWriter.flush(IndexWriter.java:2994)
>   [junit4]   2>        at org.apache.lucene.index.IndexWriter.forceMerge(IndexWriter.java:1678)
>   [junit4]   2>        at org.apache.lucene.index.IndexWriter.forceMerge(IndexWriter.java:1658)
>   [junit4]   2>        at org.apache.lucene.index.TestDemoParallelLeafReader$3.reindex(TestDemoParallelLeafReader.java:851)
>   [junit4]   2>        at org.apache.lucene.index.TestDemoParallelLeafReader$ReindexingReader.getParallelLeafReader(TestDemoParallelLeafReader.java:397)
>   [junit4]   2>        at org.apache.lucene.index.TestDemoParallelLeafReader$ReindexingReader.getCurrentReader(TestDemoParallelLeafReader.java:161)
>   [junit4]   2>        at org.apache.lucene.index.TestDemoParallelLeafReader$ReindexingReader$ReindexingMergePolicy$ReindexingOneMerge.getMergeReaders(TestDemoParallelLeafReader.java:535)
>   [junit4]   2>        at org.apache.lucene.index.IndexWriter.mergeMiddle(IndexWriter.java:3958)
>   [junit4]   2>        at org.apache.lucene.index.IndexWriter.merge(IndexWriter.java:3559)
>   [junit4]   2>        at org.apache.lucene.index.ConcurrentMergeScheduler.doMerge(ConcurrentMergeScheduler.java:519)
>   [junit4]   2>        at org.apache.lucene.index.ConcurrentMergeScheduler$MergeThread.run(ConcurrentMergeScheduler.java:557)
>   [junit4]   2>
> [... the same uncaught "Too many open files" FileSystemException is then logged for the other merge threads, all under the same tempDir-002/segs/ tree, differing only in the file being opened and the entry point: Lucene Merge Threads #12, #14, #2, #7 and #20 while writing stored fields (_0.fdt) via CompressingStoredFieldsWriter from IndexWriter.addDocument; #6 and #5 while writing doc values (_0_Memory_0.mdvd / .mdvm) via MemoryDocValuesConsumer from IndexWriter.forceMerge; #16 and #4 while writing doc values from IndexWriter.close/shutdown; #21 and #19 while listing the segments directory (FSDirectory.listAll) from the IndexWriter constructor; and #0 while writing doc values (_0_Memory_0.mdvd), where the excerpt breaks off ...]
[junit4] 2> at org.apache.lucene.index.NumericDocValuesWriter= .flush(NumericDocValuesWriter.java:90) > [junit4] 2> at org.apache.lucene.index.DefaultIndexingChain.w= riteDocValues(DefaultIndexingChain.java:142) > [junit4] 2> at org.apache.lucene.index.DefaultIndexingChain.f= lush(DefaultIndexingChain.java:93) > [junit4] 2> at org.apache.lucene.index.DocumentsWriterPerThre= ad.flush(DocumentsWriterPerThread.java:419) > [junit4] 2> at org.apache.lucene.index.DocumentsWriter.doFlus= h(DocumentsWriter.java:503) > [junit4] 2> at org.apache.lucene.index.DocumentsWriter.flushA= llThreads(DocumentsWriter.java:615) > [junit4] 2> at org.apache.lucene.index.IndexWriter.doFlush(In= dexWriter.java:3019) > [junit4] 2> at org.apache.lucene.index.IndexWriter.flush(Inde= xWriter.java:2994) > [junit4] 2> at org.apache.lucene.index.IndexWriter.forceMerge= (IndexWriter.java:1678) > [junit4] 2> at org.apache.lucene.index.IndexWriter.forceMerge= (IndexWriter.java:1658) > [junit4] 2> at org.apache.lucene.index.TestDemoParallelLeafRe= ader$3.reindex(TestDemoParallelLeafReader.java:851) > [junit4] 2> at org.apache.lucene.index.TestDemoParallelLeafRe= ader$ReindexingReader.getParallelLeafReader(TestDemoParallelLeafReader.java= :397) > [junit4] 2> at org.apache.lucene.index.TestDemoParallelLeafRe= ader$ReindexingReader.getCurrentReader(TestDemoParallelLeafReader.java:161) > [junit4] 2> at org.apache.lucene.index.TestDemoParallelLeafRe= ader$ReindexingReader$ReindexingMergePolicy$ReindexingOneMerge.getMergeRead= ers(TestDemoParallelLeafReader.java:535) > [junit4] 2> at org.apache.lucene.index.IndexWriter.mergeMiddl= e(IndexWriter.java:3958) > [junit4] 2> at org.apache.lucene.index.IndexWriter.merge(Inde= xWriter.java:3559) > [junit4] 2> at org.apache.lucene.index.ConcurrentMergeSchedul= er.doMerge(ConcurrentMergeScheduler.java:519) > [junit4] 2> at org.apache.lucene.index.ConcurrentMergeSchedul= er$MergeThread.run(ConcurrentMergeScheduler.java:557) > [junit4] 2> > [junit4] 2> jan 10, 2015 7:12:15 PM com.carrotsearch.randomizedtesti= ng.RandomizedRunner$QueueUncaughtExceptionsHandler uncaughtException > [junit4] 2> WARNING: Uncaught exception in thread: Thread[Lucene Mer= ge Thread #13,5,TGRP-TestDemoParallelLeafReader] > [junit4] 2> org.apache.lucene.index.MergePolicy$MergeException: java= .nio.file.FileSystemException: /usr/home/jenkins/jenkins-slave/workspace/Lu= cene-Solr-NightlyTests-5.x/lucene/build/core/test/J3/temp/lucene.index.Test= DemoParallelLeafReader 78910BB799798171-001/tempDir-002/segs/6za9dzr2yvnqea= arom3jvpz70_43: Too many open files > [junit4] 2> at __randomizedtesting.SeedInfo.seed([78910BB7997= 98171]:0) > [junit4] 2> at org.apache.lucene.index.ConcurrentMergeSchedul= er.handleMergeException(ConcurrentMergeScheduler.java:611) > [junit4] 2> at org.apache.lucene.index.ConcurrentMergeSchedul= er$MergeThread.run(ConcurrentMergeScheduler.java:581) > [junit4] 2> Caused by: java.nio.file.FileSystemException: /usr/home/= jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.x/lucene/build/c= ore/test/J3/temp/lucene.index.TestDemoParallelLeafReader 78910BB799798171-0= 01/tempDir-002/segs/6za9dzr2yvnqeaarom3jvpz70_43: Too many open files > [junit4] 2> at org.apache.lucene.mockfile.HandleLimitFS.onOpe= n(HandleLimitFS.java:49) > [junit4] 2> at org.apache.lucene.mockfile.HandleTrackingFS.ca= llOpenHook(HandleTrackingFS.java:84) > [junit4] 2> at org.apache.lucene.mockfile.HandleTrackingFS.ne= wDirectoryStream(HandleTrackingFS.java:298) > [junit4] 2> at java.nio.file.Files.newDirectoryStream(Files.j= 
ava:545) > [junit4] 2> at org.apache.lucene.store.FSDirectory.listAll(FS= Directory.java:171) > [junit4] 2> at org.apache.lucene.store.FSDirectory.listAll(FS= Directory.java:191) > [junit4] 2> at org.apache.lucene.store.MockDirectoryWrapper.l= istAll(MockDirectoryWrapper.java:1000) > [junit4] 2> at org.apache.lucene.index.DirectoryReader.indexE= xists(DirectoryReader.java:288) > [junit4] 2> at org.apache.lucene.index.IndexWriter.(Ind= exWriter.java:787) > [junit4] 2> at org.apache.lucene.index.TestDemoParallelLeafRe= ader$3.reindex(TestDemoParallelLeafReader.java:823) > [junit4] 2> at org.apache.lucene.index.TestDemoParallelLeafRe= ader$ReindexingReader.getParallelLeafReader(TestDemoParallelLeafReader.java= :397) > [junit4] 2> at org.apache.lucene.index.TestDemoParallelLeafRe= ader$ReindexingReader.getCurrentReader(TestDemoParallelLeafReader.java:161) > [junit4] 2> at org.apache.lucene.index.TestDemoParallelLeafRe= ader$ReindexingReader$ReindexingMergePolicy$ReindexingOneMerge.getMergeRead= ers(TestDemoParallelLeafReader.java:535) > [junit4] 2> at org.apache.lucene.index.IndexWriter.mergeMiddl= e(IndexWriter.java:3958) > [junit4] 2> at org.apache.lucene.index.IndexWriter.merge(Inde= xWriter.java:3559) > [junit4] 2> at org.apache.lucene.index.ConcurrentMergeSchedul= er.doMerge(ConcurrentMergeScheduler.java:519) > [junit4] 2> at org.apache.lucene.index.ConcurrentMergeSchedul= er$MergeThread.run(ConcurrentMergeScheduler.java:557) > [junit4] 2> > [junit4] 2> jan 10, 2015 7:12:15 PM com.carrotsearch.randomizedtesti= ng.RandomizedRunner$QueueUncaughtExceptionsHandler uncaughtException > [junit4] 2> WARNING: Uncaught exception in thread: Thread[Lucene Mer= ge Thread #8,5,TGRP-TestDemoParallelLeafReader] > [junit4] 2> org.apache.lucene.index.MergePolicy$MergeException: java= .nio.file.FileSystemException: /usr/home/jenkins/jenkins-slave/workspace/Lu= cene-Solr-NightlyTests-5.x/lucene/build/core/test/J3/temp/lucene.index.Test= DemoParallelLeafReader 78910BB799798171-001/tempDir-002/segs/6za9dzr2yvnqea= arom3jvpz1o_43: Too many open files > [junit4] 2> at __randomizedtesting.SeedInfo.seed([78910BB7997= 98171]:0) > [junit4] 2> at org.apache.lucene.index.ConcurrentMergeSchedul= er.handleMergeException(ConcurrentMergeScheduler.java:611) > [junit4] 2> at org.apache.lucene.index.ConcurrentMergeSchedul= er$MergeThread.run(ConcurrentMergeScheduler.java:581) > [junit4] 2> Caused by: java.nio.file.FileSystemException: /usr/home/= jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.x/lucene/build/c= ore/test/J3/temp/lucene.index.TestDemoParallelLeafReader 78910BB799798171-0= 01/tempDir-002/segs/6za9dzr2yvnqeaarom3jvpz1o_43: Too many open files > [junit4] 2> at org.apache.lucene.mockfile.HandleLimitFS.onOpe= n(HandleLimitFS.java:49) > [junit4] 2> at org.apache.lucene.mockfile.HandleTrackingFS.ca= llOpenHook(HandleTrackingFS.java:84) > [junit4] 2> at org.apache.lucene.mockfile.HandleTrackingFS.ne= wDirectoryStream(HandleTrackingFS.java:298) > [junit4] 2> at java.nio.file.Files.newDirectoryStream(Files.j= ava:545) > [junit4] 2> at org.apache.lucene.store.FSDirectory.listAll(FS= Directory.java:171) > [junit4] 2> at org.apache.lucene.store.FSDirectory.listAll(FS= Directory.java:191) > [junit4] 2> at org.apache.lucene.store.MockDirectoryWrapper.l= istAll(MockDirectoryWrapper.java:1000) > [junit4] 2> at org.apache.lucene.index.DirectoryReader.indexE= xists(DirectoryReader.java:288) > [junit4] 2> at org.apache.lucene.index.IndexWriter.(Ind= exWriter.java:787) > [junit4] 2> at 
org.apache.lucene.index.TestDemoParallelLeafRe= ader$3.reindex(TestDemoParallelLeafReader.java:823) > [junit4] 2> at org.apache.lucene.index.TestDemoParallelLeafRe= ader$ReindexingReader.getParallelLeafReader(TestDemoParallelLeafReader.java= :397) > [junit4] 2> at org.apache.lucene.index.TestDemoParallelLeafRe= ader$ReindexingReader.getCurrentReader(TestDemoParallelLeafReader.java:161) > [junit4] 2> at org.apache.lucene.index.TestDemoParallelLeafRe= ader$ReindexingReader$ReindexingMergePolicy$ReindexingOneMerge.getMergeRead= ers(TestDemoParallelLeafReader.java:535) > [junit4] 2> at org.apache.lucene.index.IndexWriter.mergeMiddl= e(IndexWriter.java:3958) > [junit4] 2> at org.apache.lucene.index.IndexWriter.merge(Inde= xWriter.java:3559) > [junit4] 2> at org.apache.lucene.index.ConcurrentMergeSchedul= er.doMerge(ConcurrentMergeScheduler.java:519) > [junit4] 2> at org.apache.lucene.index.ConcurrentMergeSchedul= er$MergeThread.run(ConcurrentMergeScheduler.java:557) > [junit4] 2> > [junit4] 2> jan 10, 2015 7:12:15 PM com.carrotsearch.randomizedtesti= ng.RandomizedRunner$QueueUncaughtExceptionsHandler uncaughtException > [junit4] 2> WARNING: Uncaught exception in thread: Thread[Lucene Mer= ge Thread #15,5,TGRP-TestDemoParallelLeafReader] > [junit4] 2> org.apache.lucene.index.MergePolicy$MergeException: java= .nio.file.FileSystemException: /usr/home/jenkins/jenkins-slave/workspace/Lu= cene-Solr-NightlyTests-5.x/lucene/build/core/test/J3/temp/lucene.index.Test= DemoParallelLeafReader 78910BB799798171-001/tempDir-002/segs/6za9dzr2yvnqea= arom3jvpz9t_45/_0_Memory_0.mdvd: Too many open files > [junit4] 2> at __randomizedtesting.SeedInfo.seed([78910BB7997= 98171]:0) > [junit4] 2> at org.apache.lucene.index.ConcurrentMergeSchedul= er.handleMergeException(ConcurrentMergeScheduler.java:611) > [junit4] 2> at org.apache.lucene.index.ConcurrentMergeSchedul= er$MergeThread.run(ConcurrentMergeScheduler.java:581) > [junit4] 2> Caused by: java.nio.file.FileSystemException: /usr/home/= jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.x/lucene/build/c= ore/test/J3/temp/lucene.index.TestDemoParallelLeafReader 78910BB799798171-0= 01/tempDir-002/segs/6za9dzr2yvnqeaarom3jvpz9t_45/_0_Memory_0.mdvd: Too many= open files > [junit4] 2> at org.apache.lucene.mockfile.HandleLimitFS.onOpe= n(HandleLimitFS.java:49) > [junit4] 2> at org.apache.lucene.mockfile.HandleTrackingFS.ca= llOpenHook(HandleTrackingFS.java:84) > [junit4] 2> at org.apache.lucene.mockfile.HandleTrackingFS.ne= wOutputStream(HandleTrackingFS.java:157) > [junit4] 2> at java.nio.file.Files.newOutputStream(Files.java= :172) > [junit4] 2> at org.apache.lucene.store.FSDirectory$FSIndexOut= put.(FSDirectory.java:265) > [junit4] 2> at org.apache.lucene.store.FSDirectory.createOutp= ut(FSDirectory.java:214) > [junit4] 2> at org.apache.lucene.store.MockDirectoryWrapper.c= reateOutput(MockDirectoryWrapper.java:607) > [junit4] 2> at org.apache.lucene.store.TrackingDirectoryWrapp= er.createOutput(TrackingDirectoryWrapper.java:43) > [junit4] 2> at org.apache.lucene.codecs.memory.MemoryDocValue= sConsumer.(MemoryDocValuesConsumer.java:76) > [junit4] 2> at org.apache.lucene.codecs.memory.MemoryDocValue= sFormat.fieldsConsumer(MemoryDocValuesFormat.java:60) > [junit4] 2> at org.apache.lucene.codecs.perfield.PerFieldDocV= aluesFormat$FieldsWriter.getInstance(PerFieldDocValuesFormat.java:187) > [junit4] 2> at org.apache.lucene.codecs.perfield.PerFieldDocV= aluesFormat$FieldsWriter.addNumericField(PerFieldDocValuesFormat.java:111) > [junit4] 2> at 
org.apache.lucene.index.NumericDocValuesWriter= .flush(NumericDocValuesWriter.java:90) > [junit4] 2> at org.apache.lucene.index.DefaultIndexingChain.w= riteDocValues(DefaultIndexingChain.java:142) > [junit4] 2> at org.apache.lucene.index.DefaultIndexingChain.f= lush(DefaultIndexingChain.java:93) > [junit4] 2> at org.apache.lucene.index.DocumentsWriterPerThre= ad.flush(DocumentsWriterPerThread.java:419) > [junit4] 2> at org.apache.lucene.index.DocumentsWriter.doFlus= h(DocumentsWriter.java:503) > [junit4] 2> at org.apache.lucene.index.DocumentsWriter.flushA= llThreads(DocumentsWriter.java:615) > [junit4] 2> at org.apache.lucene.index.IndexWriter.doFlush(In= dexWriter.java:3019) > [junit4] 2> at org.apache.lucene.index.IndexWriter.flush(Inde= xWriter.java:2994) > [junit4] 2> at org.apache.lucene.index.IndexWriter.shutdown(I= ndexWriter.java:963) > [junit4] 2> at org.apache.lucene.index.IndexWriter.close(Inde= xWriter.java:1008) > [junit4] 2> at org.apache.lucene.index.TestDemoParallelLeafRe= ader$3.reindex(TestDemoParallelLeafReader.java:854) > [junit4] 2> at org.apache.lucene.index.TestDemoParallelLeafRe= ader$ReindexingReader.getParallelLeafReader(TestDemoParallelLeafReader.java= :397) > [junit4] 2> at org.apache.lucene.index.TestDemoParallelLeafRe= ader$ReindexingReader.getCurrentReader(TestDemoParallelLeafReader.java:161) > [junit4] 2> at org.apache.lucene.index.TestDemoParallelLeafRe= ader$ReindexingReader$ReindexingMergePolicy$ReindexingOneMerge.getMergeRead= ers(TestDemoParallelLeafReader.java:535) > [junit4] 2> at org.apache.lucene.index.IndexWriter.mergeMiddl= e(IndexWriter.java:3958) > [junit4] 2> at org.apache.lucene.index.IndexWriter.merge(Inde= xWriter.java:3559) > [junit4] 2> at org.apache.lucene.index.ConcurrentMergeSchedul= er.doMerge(ConcurrentMergeScheduler.java:519) > [junit4] 2> at org.apache.lucene.index.ConcurrentMergeSchedul= er$MergeThread.run(ConcurrentMergeScheduler.java:557) > [junit4] 2> > [junit4] 2> jan 10, 2015 7:12:15 PM com.carrotsearch.randomizedtesti= ng.RandomizedRunner$QueueUncaughtExceptionsHandler uncaughtException > [junit4] 2> WARNING: Uncaught exception in thread: Thread[Lucene Mer= ge Thread #3,5,TGRP-TestDemoParallelLeafReader] > [junit4] 2> org.apache.lucene.index.MergePolicy$MergeException: java= .nio.file.FileSystemException: /usr/home/jenkins/jenkins-slave/workspace/Lu= cene-Solr-NightlyTests-5.x/lucene/build/core/test/J3/temp/lucene.index.Test= DemoParallelLeafReader 78910BB799798171-001/tempDir-002/segs/6za9dzr2yvnqea= arom3jvpywn_41: Too many open files > [junit4] 2> at __randomizedtesting.SeedInfo.seed([78910BB7997= 98171]:0) > [junit4] 2> at org.apache.lucene.index.ConcurrentMergeSchedul= er.handleMergeException(ConcurrentMergeScheduler.java:611) > [junit4] 2> at org.apache.lucene.index.ConcurrentMergeSchedul= er$MergeThread.run(ConcurrentMergeScheduler.java:581) > [junit4] 2> Caused by: java.nio.file.FileSystemException: /usr/home/= jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.x/lucene/build/c= ore/test/J3/temp/lucene.index.TestDemoParallelLeafReader 78910BB799798171-0= 01/tempDir-002/segs/6za9dzr2yvnqeaarom3jvpywn_41: Too many open files > [junit4] 2> at org.apache.lucene.mockfile.HandleLimitFS.onOpe= n(HandleLimitFS.java:49) > [junit4] 2> at org.apache.lucene.mockfile.HandleTrackingFS.ca= llOpenHook(HandleTrackingFS.java:84) > [junit4] 2> at org.apache.lucene.mockfile.HandleTrackingFS.ne= wDirectoryStream(HandleTrackingFS.java:298) > [junit4] 2> at java.nio.file.Files.newDirectoryStream(Files.j= ava:545) > [junit4] 2> at 
org.apache.lucene.store.FSDirectory.listAll(FS= Directory.java:171) > [junit4] 2> at org.apache.lucene.store.FSDirectory.listAll(FS= Directory.java:191) > [junit4] 2> at org.apache.lucene.store.MockDirectoryWrapper.l= istAll(MockDirectoryWrapper.java:1000) > [junit4] 2> at org.apache.lucene.index.DirectoryReader.indexE= xists(DirectoryReader.java:288) > [junit4] 2> at org.apache.lucene.index.IndexWriter.(Ind= exWriter.java:787) > [junit4] 2> at org.apache.lucene.index.TestDemoParallelLeafRe= ader$3.reindex(TestDemoParallelLeafReader.java:823) > [junit4] 2> at org.apache.lucene.index.TestDemoParallelLeafRe= ader$ReindexingReader.getParallelLeafReader(TestDemoParallelLeafReader.java= :397) > [junit4] 2> at org.apache.lucene.index.TestDemoParallelLeafRe= ader$ReindexingReader.getCurrentReader(TestDemoParallelLeafReader.java:161) > [junit4] 2> at org.apache.lucene.index.TestDemoParallelLeafRe= ader$ReindexingReader$ReindexingMergePolicy$ReindexingOneMerge.getMergeRead= ers(TestDemoParallelLeafReader.java:535) > [junit4] 2> at org.apache.lucene.index.IndexWriter.mergeMiddl= e(IndexWriter.java:3958) > [junit4] 2> at org.apache.lucene.index.IndexWriter.merge(Inde= xWriter.java:3559) > [junit4] 2> at org.apache.lucene.index.ConcurrentMergeSchedul= er.doMerge(ConcurrentMergeScheduler.java:519) > [junit4] 2> at org.apache.lucene.index.ConcurrentMergeSchedul= er$MergeThread.run(ConcurrentMergeScheduler.java:557) > [junit4] 2> > [junit4] 2> jan 10, 2015 7:12:15 PM com.carrotsearch.randomizedtesti= ng.RandomizedRunner$QueueUncaughtExceptionsHandler uncaughtException > [junit4] 2> WARNING: Uncaught exception in thread: Thread[Lucene Mer= ge Thread #11,5,TGRP-TestDemoParallelLeafReader] > [junit4] 2> org.apache.lucene.index.MergePolicy$MergeException: java= .nio.file.FileSystemException: /usr/home/jenkins/jenkins-slave/workspace/Lu= cene-Solr-NightlyTests-5.x/lucene/build/core/test/J3/temp/lucene.index.Test= DemoParallelLeafReader 78910BB799798171-001/tempDir-002/segs/6za9dzr2yvnqea= arom3jvpz52_43/_0_Memory_0.mdvd: Too many open files > [junit4] 2> at __randomizedtesting.SeedInfo.seed([78910BB7997= 98171]:0) > [junit4] 2> at org.apache.lucene.index.ConcurrentMergeSchedul= er.handleMergeException(ConcurrentMergeScheduler.java:611) > [junit4] 2> at org.apache.lucene.index.ConcurrentMergeSchedul= er$MergeThread.run(ConcurrentMergeScheduler.java:581) > [junit4] 2> Caused by: java.nio.file.FileSystemException: /usr/home/= jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.x/lucene/build/c= ore/test/J3/temp/lucene.index.TestDemoParallelLeafReader 78910BB799798171-0= 01/tempDir-002/segs/6za9dzr2yvnqeaarom3jvpz52_43/_0_Memory_0.mdvd: Too many= open files > [junit4] 2> at org.apache.lucene.mockfile.HandleLimitFS.onOpe= n(HandleLimitFS.java:49) > [junit4] 2> at org.apache.lucene.mockfile.HandleTrackingFS.ca= llOpenHook(HandleTrackingFS.java:84) > [junit4] 2> at org.apache.lucene.mockfile.HandleTrackingFS.ne= wOutputStream(HandleTrackingFS.java:157) > [junit4] 2> at java.nio.file.Files.newOutputStream(Files.java= :172) > [junit4] 2> at org.apache.lucene.store.FSDirectory$FSIndexOut= put.(FSDirectory.java:265) > [junit4] 2> at org.apache.lucene.store.FSDirectory.createOutp= ut(FSDirectory.java:214) > [junit4] 2> at org.apache.lucene.store.MockDirectoryWrapper.c= reateOutput(MockDirectoryWrapper.java:607) > [junit4] 2> at org.apache.lucene.store.TrackingDirectoryWrapp= er.createOutput(TrackingDirectoryWrapper.java:43) > [junit4] 2> at org.apache.lucene.codecs.memory.MemoryDocValue= 
sConsumer.(MemoryDocValuesConsumer.java:76) > [junit4] 2> at org.apache.lucene.codecs.memory.MemoryDocValue= sFormat.fieldsConsumer(MemoryDocValuesFormat.java:60) > [junit4] 2> at org.apache.lucene.codecs.perfield.PerFieldDocV= aluesFormat$FieldsWriter.getInstance(PerFieldDocValuesFormat.java:187) > [junit4] 2> at org.apache.lucene.codecs.perfield.PerFieldDocV= aluesFormat$FieldsWriter.addNumericField(PerFieldDocValuesFormat.java:111) > [junit4] 2> at org.apache.lucene.index.NumericDocValuesWriter= .flush(NumericDocValuesWriter.java:90) > [junit4] 2> at org.apache.lucene.index.DefaultIndexingChain.w= riteDocValues(DefaultIndexingChain.java:142) > [junit4] 2> at org.apache.lucene.index.DefaultIndexingChain.f= lush(DefaultIndexingChain.java:93) > [junit4] 2> at org.apache.lucene.index.DocumentsWriterPerThre= ad.flush(DocumentsWriterPerThread.java:419) > [junit4] 2> at org.apache.lucene.index.DocumentsWriter.doFlus= h(DocumentsWriter.java:503) > [junit4] 2> at org.apache.lucene.index.DocumentsWriter.flushA= llThreads(DocumentsWriter.java:615) > [junit4] 2> at org.apache.lucene.index.IndexWriter.doFlush(In= dexWriter.java:3019) > [junit4] 2> at org.apache.lucene.index.IndexWriter.flush(Inde= xWriter.java:2994) > [junit4] 2> at org.apache.lucene.index.IndexWriter.forceMerge= (IndexWriter.java:1678) > [junit4] 2> at org.apache.lucene.index.IndexWriter.forceMerge= (IndexWriter.java:1658) > [junit4] 2> at org.apache.lucene.index.TestDemoParallelLeafRe= ader$3.reindex(TestDemoParallelLeafReader.java:851) > [junit4] 2> at org.apache.lucene.index.TestDemoParallelLeafRe= ader$ReindexingReader.getParallelLeafReader(TestDemoParallelLeafReader.java= :397) > [junit4] 2> at org.apache.lucene.index.TestDemoParallelLeafRe= ader$ReindexingReader.getCurrentReader(TestDemoParallelLeafReader.java:161) > [junit4] 2> at org.apache.lucene.index.TestDemoParallelLeafRe= ader$ReindexingReader$ReindexingMergePolicy$ReindexingOneMerge.getMergeRead= ers(TestDemoParallelLeafReader.java:535) > [junit4] 2> at org.apache.lucene.index.IndexWriter.mergeMiddl= e(IndexWriter.java:3958) > [junit4] 2> at org.apache.lucene.index.IndexWriter.merge(Inde= xWriter.java:3559) > [junit4] 2> at org.apache.lucene.index.ConcurrentMergeSchedul= er.doMerge(ConcurrentMergeScheduler.java:519) > [junit4] 2> at org.apache.lucene.index.ConcurrentMergeSchedul= er$MergeThread.run(ConcurrentMergeScheduler.java:557) > [junit4] 2> > [junit4] 2> jan 10, 2015 7:12:15 PM com.carrotsearch.randomizedtesti= ng.RandomizedRunner$QueueUncaughtExceptionsHandler uncaughtException > [junit4] 2> WARNING: Uncaught exception in thread: Thread[Lucene Mer= ge Thread #10,5,TGRP-TestDemoParallelLeafReader] > [junit4] 2> org.apache.lucene.index.MergePolicy$MergeException: java= .nio.file.FileSystemException: /usr/home/jenkins/jenkins-slave/workspace/Lu= cene-Solr-NightlyTests-5.x/lucene/build/core/test/J3/temp/lucene.index.Test= DemoParallelLeafReader 78910BB799798171-001/tempDir-002/segs/6za9dzr2yvnqea= arom3jvpz3l_43/_0_Memory_0.mdvm: Too many open files > [junit4] 2> at __randomizedtesting.SeedInfo.seed([78910BB7997= 98171]:0) > [junit4] 2> at org.apache.lucene.index.ConcurrentMergeSchedul= er.handleMergeException(ConcurrentMergeScheduler.java:611) > [junit4] 2> at org.apache.lucene.index.ConcurrentMergeSchedul= er$MergeThread.run(ConcurrentMergeScheduler.java:581) > [junit4] 2> Caused by: java.nio.file.FileSystemException: /usr/home/= jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.x/lucene/build/c= ore/test/J3/temp/lucene.index.TestDemoParallelLeafReader 
78910BB799798171-0= 01/tempDir-002/segs/6za9dzr2yvnqeaarom3jvpz3l_43/_0_Memory_0.mdvm: Too many= open files > [junit4] 2> at org.apache.lucene.mockfile.HandleLimitFS.onOpe= n(HandleLimitFS.java:49) > [junit4] 2> at org.apache.lucene.mockfile.HandleTrackingFS.ca= llOpenHook(HandleTrackingFS.java:84) > [junit4] 2> at org.apache.lucene.mockfile.HandleTrackingFS.ne= wOutputStream(HandleTrackingFS.java:157) > [junit4] 2> at java.nio.file.Files.newOutputStream(Files.java= :172) > [junit4] 2> at org.apache.lucene.store.FSDirectory$FSIndexOut= put.(FSDirectory.java:265) > [junit4] 2> at org.apache.lucene.store.FSDirectory.createOutp= ut(FSDirectory.java:214) > [junit4] 2> at org.apache.lucene.store.MockDirectoryWrapper.c= reateOutput(MockDirectoryWrapper.java:607) > [junit4] 2> at org.apache.lucene.store.TrackingDirectoryWrapp= er.createOutput(TrackingDirectoryWrapper.java:43) > [junit4] 2> at org.apache.lucene.codecs.memory.MemoryDocValue= sConsumer.(MemoryDocValuesConsumer.java:79) > [junit4] 2> at org.apache.lucene.codecs.memory.MemoryDocValue= sFormat.fieldsConsumer(MemoryDocValuesFormat.java:60) > [junit4] 2> at org.apache.lucene.codecs.perfield.PerFieldDocV= aluesFormat$FieldsWriter.getInstance(PerFieldDocValuesFormat.java:187) > [junit4] 2> at org.apache.lucene.codecs.perfield.PerFieldDocV= aluesFormat$FieldsWriter.addNumericField(PerFieldDocValuesFormat.java:111) > [junit4] 2> at org.apache.lucene.index.NumericDocValuesWriter= .flush(NumericDocValuesWriter.java:90) > [junit4] 2> at org.apache.lucene.index.DefaultIndexingChain.w= riteDocValues(DefaultIndexingChain.java:142) > [junit4] 2> at org.apache.lucene.index.DefaultIndexingChain.f= lush(DefaultIndexingChain.java:93) > [junit4] 2> at org.apache.lucene.index.DocumentsWriterPerThre= ad.flush(DocumentsWriterPerThread.java:419) > [junit4] 2> at org.apache.lucene.index.DocumentsWriter.doFlus= h(DocumentsWriter.java:503) > [junit4] 2> at org.apache.lucene.index.DocumentsWriter.flushA= llThreads(DocumentsWriter.java:615) > [junit4] 2> at org.apache.lucene.index.IndexWriter.doFlush(In= dexWriter.java:3019) > [junit4] 2> at org.apache.lucene.index.IndexWriter.flush(Inde= xWriter.java:2994) > [junit4] 2> at org.apache.lucene.index.IndexWriter.forceMerge= (IndexWriter.java:1678) > [junit4] 2> at org.apache.lucene.index.IndexWriter.forceMerge= (IndexWriter.java:1658) > [junit4] 2> at org.apache.lucene.index.TestDemoParallelLeafRe= ader$3.reindex(TestDemoParallelLeafReader.java:851) > [junit4] 2> at org.apache.lucene.index.TestDemoParallelLeafRe= ader$ReindexingReader.getParallelLeafReader(TestDemoParallelLeafReader.java= :397) > [junit4] 2> at org.apache.lucene.index.TestDemoParallelLeafRe= ader$ReindexingReader.getCurrentReader(TestDemoParallelLeafReader.java:161) > [junit4] 2> at org.apache.lucene.index.TestDemoParallelLeafRe= ader$ReindexingReader$ReindexingMergePolicy$ReindexingOneMerge.getMergeRead= ers(TestDemoParallelLeafReader.java:535) > [junit4] 2> at org.apache.lucene.index.IndexWriter.mergeMiddl= e(IndexWriter.java:3958) > [junit4] 2> at org.apache.lucene.index.IndexWriter.merge(Inde= xWriter.java:3559) > [junit4] 2> at org.apache.lucene.index.ConcurrentMergeSchedul= er.doMerge(ConcurrentMergeScheduler.java:519) > [junit4] 2> at org.apache.lucene.index.ConcurrentMergeSchedul= er$MergeThread.run(ConcurrentMergeScheduler.java:557) > [junit4] 2> > [junit4] 2> jan 10, 2015 7:12:15 PM com.carrotsearch.randomizedtesti= ng.RandomizedRunner$QueueUncaughtExceptionsHandler uncaughtException > [junit4] 2> WARNING: Uncaught exception in thread: 
Thread[Lucene Mer= ge Thread #18,5,TGRP-TestDemoParallelLeafReader] > [junit4] 2> org.apache.lucene.index.MergePolicy$MergeException: java= .nio.file.FileSystemException: /usr/home/jenkins/jenkins-slave/workspace/Lu= cene-Solr-NightlyTests-5.x/lucene/build/core/test/J3/temp/lucene.index.Test= DemoParallelLeafReader 78910BB799798171-001/tempDir-002/segs/6za9dzr2yvnqea= arom3jvpzem_45/_0_Memory_0.mdvm: Too many open files > [junit4] 2> at __randomizedtesting.SeedInfo.seed([78910BB7997= 98171]:0) > [junit4] 2> at org.apache.lucene.index.ConcurrentMergeSchedul= er.handleMergeException(ConcurrentMergeScheduler.java:611) > [junit4] 2> at org.apache.lucene.index.ConcurrentMergeSchedul= er$MergeThread.run(ConcurrentMergeScheduler.java:581) > [junit4] 2> Caused by: java.nio.file.FileSystemException: /usr/home/= jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.x/lucene/build/c= ore/test/J3/temp/lucene.index.TestDemoParallelLeafReader 78910BB799798171-0= 01/tempDir-002/segs/6za9dzr2yvnqeaarom3jvpzem_45/_0_Memory_0.mdvm: Too many= open files > [junit4] 2> at org.apache.lucene.mockfile.HandleLimitFS.onOpe= n(HandleLimitFS.java:49) > [junit4] 2> at org.apache.lucene.mockfile.HandleTrackingFS.ca= llOpenHook(HandleTrackingFS.java:84) > [junit4] 2> at org.apache.lucene.mockfile.HandleTrackingFS.ne= wOutputStream(HandleTrackingFS.java:157) > [junit4] 2> at java.nio.file.Files.newOutputStream(Files.java= :172) > [junit4] 2> at org.apache.lucene.store.FSDirectory$FSIndexOut= put.(FSDirectory.java:265) > [junit4] 2> at org.apache.lucene.store.FSDirectory.createOutp= ut(FSDirectory.java:214) > [junit4] 2> at org.apache.lucene.store.MockDirectoryWrapper.c= reateOutput(MockDirectoryWrapper.java:607) > [junit4] 2> at org.apache.lucene.store.TrackingDirectoryWrapp= er.createOutput(TrackingDirectoryWrapper.java:43) > [junit4] 2> at org.apache.lucene.codecs.memory.MemoryDocValue= sConsumer.(MemoryDocValuesConsumer.java:79) > [junit4] 2> at org.apache.lucene.codecs.memory.MemoryDocValue= sFormat.fieldsConsumer(MemoryDocValuesFormat.java:60) > [junit4] 2> at org.apache.lucene.codecs.perfield.PerFieldDocV= aluesFormat$FieldsWriter.getInstance(PerFieldDocValuesFormat.java:187) > [junit4] 2> at org.apache.lucene.codecs.perfield.PerFieldDocV= aluesFormat$FieldsWriter.addNumericField(PerFieldDocValuesFormat.java:111) > [junit4] 2> at org.apache.lucene.index.NumericDocValuesWriter= .flush(NumericDocValuesWriter.java:90) > [junit4] 2> at org.apache.lucene.index.DefaultIndexingChain.w= riteDocValues(DefaultIndexingChain.java:142) > [junit4] 2> at org.apache.lucene.index.DefaultIndexingChain.f= lush(DefaultIndexingChain.java:93) > [junit4] 2> at org.apache.lucene.index.DocumentsWriterPerThre= ad.flush(DocumentsWriterPerThread.java:419) > [junit4] 2> at org.apache.lucene.index.DocumentsWriter.doFlus= h(DocumentsWriter.java:503) > [junit4] 2> at org.apache.lucene.index.DocumentsWriter.flushA= llThreads(DocumentsWriter.java:615) > [junit4] 2> at org.apache.lucene.index.IndexWriter.doFlush(In= dexWriter.java:3019) > [junit4] 2> at org.apache.lucene.index.IndexWriter.flush(Inde= xWriter.java:2994) > [junit4] 2> at org.apache.lucene.index.IndexWriter.shutdown(I= ndexWriter.java:963) > [junit4] 2> at org.apache.lucene.index.IndexWriter.close(Inde= xWriter.java:1008) > [junit4] 2> at org.apache.lucene.index.TestDemoParallelLeafRe= ader$3.reindex(TestDemoParallelLeafReader.java:854) > [junit4] 2> at org.apache.lucene.index.TestDemoParallelLeafRe= ader$ReindexingReader.getParallelLeafReader(TestDemoParallelLeafReader.java= 
:397) > [junit4] 2> at org.apache.lucene.index.TestDemoParallelLeafRe= ader$ReindexingReader.getCurrentReader(TestDemoParallelLeafReader.java:161) > [junit4] 2> at org.apache.lucene.index.TestDemoParallelLeafRe= ader$ReindexingReader$ReindexingMergePolicy$ReindexingOneMerge.getMergeRead= ers(TestDemoParallelLeafReader.java:535) > [junit4] 2> at org.apache.lucene.index.IndexWriter.mergeMiddl= e(IndexWriter.java:3958) > [junit4] 2> at org.apache.lucene.index.IndexWriter.merge(Inde= xWriter.java:3559) > [junit4] 2> at org.apache.lucene.index.ConcurrentMergeSchedul= er.doMerge(ConcurrentMergeScheduler.java:519) > [junit4] 2> at org.apache.lucene.index.ConcurrentMergeSchedul= er$MergeThread.run(ConcurrentMergeScheduler.java:557) > [junit4] 2> > [junit4] 2> jan 10, 2015 7:12:20 PM com.carrotsearch.randomizedtesti= ng.RandomizedRunner$QueueUncaughtExceptionsHandler uncaughtException > [junit4] 2> WARNING: Uncaught exception > > [...truncated too long message...] > > Wrapper.createOutput(MockDirectoryWrapper.java:607) > [junit4] > at org.apache.lucene.store.TrackingDirectoryWrapp= er.createOutput(TrackingDirectoryWrapper.java:43) > [junit4] > at org.apache.lucene.codecs.compressing.Compressi= ngStoredFieldsWriter.(CompressingStoredFieldsWriter.java:105) > [junit4] > at org.apache.lucene.codecs.compressing.Compressi= ngStoredFieldsFormat.fieldsWriter(CompressingStoredFieldsFormat.java:128) > [junit4] > at org.apache.lucene.codecs.lucene50.Lucene50Stor= edFieldsFormat.fieldsWriter(Lucene50StoredFieldsFormat.java:179) > [junit4] > at org.apache.lucene.codecs.asserting.AssertingSt= oredFieldsFormat.fieldsWriter(AssertingStoredFieldsFormat.java:49) > [junit4] > at org.apache.lucene.index.DefaultIndexingChain.i= nitStoredFieldsWriter(DefaultIndexingChain.java:81) > [junit4] > at org.apache.lucene.index.DefaultIndexingChain.s= tartStoredFields(DefaultIndexingChain.java:258) > [junit4] > at org.apache.lucene.index.DefaultIndexingChain.p= rocessDocument(DefaultIndexingChain.java:295) > [junit4] > at org.apache.lucene.index.DocumentsWriterPerThre= ad.updateDocument(DocumentsWriterPerThread.java:231) > [junit4] > at org.apache.lucene.index.DocumentsWriter.update= Document(DocumentsWriter.java:449) > [junit4] > at org.apache.lucene.index.IndexWriter.updateDocu= ment(IndexWriter.java:1415) > [junit4] > at org.apache.lucene.index.IndexWriter.addDocumen= t(IndexWriter.java:1150) > [junit4] > at org.apache.lucene.index.IndexWriter.addDocumen= t(IndexWriter.java:1135) > [junit4] > at org.apache.lucene.index.TestDemoParallelLeafRe= ader$3.reindex(TestDemoParallelLeafReader.java:835) > [junit4] > at org.apache.lucene.index.TestDemoParallelLeafRe= ader$ReindexingReader.getParallelLeafReader(TestDemoParallelLeafReader.java= :397) > [junit4] > at org.apache.lucene.index.TestDemoParallelLeafRe= ader$ReindexingReader.getCurrentReader(TestDemoParallelLeafReader.java:161) > [junit4] > at org.apache.lucene.index.TestDemoParallelLeafRe= ader$ReindexingReader$ReindexingMergePolicy$ReindexingOneMerge.getMergeRead= ers(TestDemoParallelLeafReader.java:535) > [junit4] > at org.apache.lucene.index.IndexWriter.mergeMiddl= e(IndexWriter.java:3958) > [junit4] > at org.apache.lucene.index.IndexWriter.merge(Inde= xWriter.java:3559) > [junit4] > at org.apache.lucene.index.ConcurrentMergeSchedul= er.doMerge(ConcurrentMergeScheduler.java:519) > [junit4] > at org.apache.lucene.index.ConcurrentMergeSchedul= er$MergeThread.run(ConcurrentMergeScheduler.java:557) > [junit4] 2> NOTE: leaving temporary files on disk at: /usr/home/jenk= 
> [junit4]   2> NOTE: test params are: codec=Asserting(Lucene50): {id=PostingsFormat(name=LuceneVarGapFixedInterval), text=TestBloomFilteredLucenePostings(BloomFilteringPostingsFormat(Lucene50(blocksize=128))), number=TestBloomFilteredLucenePostings(BloomFilteringPostingsFormat(Lucene50(blocksize=128)))}, docValues:{number_1=DocValuesFormat(name=Lucene50), number_4=DocValuesFormat(name=Asserting), number_53=DocValuesFormat(name=Asserting), number_84=DocValuesFormat(name=Lucene50), number_78=DocValuesFormat(name=Lucene50), number_83=DocValuesFormat(name=Memory), number_71=DocValuesFormat(name=Asserting), number_0=DocValuesFormat(name=Asserting), number_87=DocValuesFormat(name=Memory), number_30=DocValuesFormat(name=Lucene50), number_67=DocValuesFormat(name=Lucene50), number_57=DocValuesFormat(name=Asserting), number_20=DocValuesFormat(name=Asserting), number_69=DocValuesFormat(name=Memory), number_59=DocValuesFormat(name=Lucene50), number_73=DocValuesFormat(name=Lucene50), number_77=DocValuesFormat(name=Lucene50), number_175=DocValuesFormat(name=Memory), number_130=DocValuesFormat(name=Lucene50), number_49=DocValuesFormat(name=Lucene50), number_88=DocValuesFormat(name=Lucene50), number_70=DocValuesFormat(name=Lucene50), number_82=DocValuesFormat(name=Asserting), number_86=DocValuesFormat(name=Asserting), number_3=DocValuesFormat(name=Memory), number_92=DocValuesFormat(name=Lucene50), number_74=DocValuesFormat(name=Lucene50), number_198=DocValuesFormat(name=Asserting), number_160=DocValuesFormat(name=Memory), number_68=DocValuesFormat(name=Asserting), number_75=DocValuesFormat(name=Asserting), number_7=DocValuesFormat(name=Memory), number_76=DocValuesFormat(name=Memory), number_79=DocValuesFormat(name=Asserting), number_2=DocValuesFormat(name=Lucene50), number_72=DocValuesFormat(name=Memory), number_22=DocValuesFormat(name=Lucene50), number_99=DocValuesFormat(name=Lucene50), number_90=DocValuesFormat(name=Memory), number_66=DocValuesFormat(name=Lucene50), number_80=DocValuesFormat(name=Lucene50), number_91=DocValuesFormat(name=Lucene50), number=DocValuesFormat(name=Memory), number_81=DocValuesFormat(name=Lucene50), number_42=DocValuesFormat(name=Asserting)}, sim=DefaultSimilarity, locale=sk, timezone=Asia/Ashgabat
> [junit4]   2> NOTE: FreeBSD 9.1-RELEASE-p3 amd64/Oracle Corporation 1.7.0_65 (64-bit)/cpus=16,threads=1,free=138795984,total=459276288
> [junit4]   2> NOTE: All tests run in this JVM: [TestTotalHitCountCollector, TestPhraseQuery, TestDirectoryReaderReopen, TestSimpleAttributeImpl, TestBasics, TestRegExp, TestSnapshotDeletionPolicy, TestBoolean2, Nested1, TestRoaringDocIdSet, Test2BSortedDocValues, TestPositiveScoresOnlyCollector, TestUnicodeUtil, TestPerFieldPostingsFormat2, TestWildcard, TestLucene50NormsFormat, TestBitDocIdSetBuilder, TestDocsAndPositions, Nested1, TestBagOfPostings, TestPayloadNearQuery, TestFilteredQuery, TestExternalCodecs, TestIndexWriterConfig, TestFieldCacheTermsFilter, TestWorstCaseTestBehavior, TestFlex, TestPersistentSnapshotDeletionPolicy, TestCachingWrapperFilter, Test2BTerms, TestMultiTermQueryRewrites, TestSpanExplanationsOfNonMatches, TestBinaryTerms, TestSpanSearchEquivalence, TestBufferedIndexInput, TestMultiLevelSkipList, TestSimilarityBase, TestDocInverterPerFieldErrorInfo, TestAssertingDocValuesFormat, TestStringHelper, TestBinaryDocument, TestDeterminizeLexicon, TestIndexWriterWithThreads, TestSpansAdvanced, TestWindowsMMap, TestSearch, Nested1, TestNeverDelete, TestDocIdSetBuilder, TestNoDeletionPolicy, TestIndexWriterOnJRECrash, TestIndexWriterReader, TestCompressingTermVectorsFormat, TestIndexWriterMerging, TestFSTs, TestPackedInts, TestMultiMMap, TestFlushByRamOrCountsPolicy, TestSloppyPhraseQuery, TestIndexWriterDelete, TestNorms, TestSearchWithThreads, TestSearchAfter, TestConcurrentMergeScheduler, TestHighCompressionMode, TestDocumentsWriterStallControl, TestTopDocsMerge, TestStressIndexing, TestMinimize, TestRollingUpdates, TestNRTReaderWithThreads, TestPerFieldPostingsFormat, TestConsistentFieldNumbers, TestSimpleExplanations, TestDocTermOrdsRangeFilter, TestCompressingStoredFieldsFormat, TestSegmentMerger, TestBagOfPositions, TestFixedBitSet, TestNRTCachingDirectory, TestIndexWriterNRTIsCurrent, TestPostingsOffsets, TestMixedCodecs, TestTransactionRollback, TestUTF32ToUTF8, TestSpanExplanations, TestStressIndexing2, TestMultiPhraseQuery, TestLazyProxSkipping, TestDeterminism, TestCodecs, TestCharTermAttributeImpl, TestOmitPositions, TestPayloadExplanations, TestParallelLeafReader, TestBytesRefHash, TestThreadedForceMerge, TestIndexInput, TestSloppyPhraseQuery2, TestDisjunctionMaxQuery, TestForceMergeForever, TestSameScoresWithThreads, TestQueryWrapperFilter, TestMultiFields, TestSimpleExplanationsOfNonMatches, TestTermVectorsReader, TestSpansAdvanced2, TestIntBlockPool, TestPrefixInBooleanQuery, TestTopDocsCollector, TestSegmentTermEnum, TestUniqueTermCount, TestBooleanQuery, TestSortRandom, TestPrefixCodedTerms, TestAllFilesHaveCodecHeader, TestDocument, TestTermScorer, TestTopFieldCollector, TestSumDocFreq, TestIndexableField, TestIndexFileDeleter, TestMatchAllDocsQuery, TestParallelTermEnum, TestSubScorerFreqs, Nested, Nested, TestDateSort, TestVirtualMethod, TestDateTools, TestAutomatonQueryUnicode, TestMockDirectoryWrapper, TestNoMergePolicy, TestNamedSPILoader, TestBytesRef, TestNot, TestVersion, TestScoreCachingWrappingScorer, TestTermdocPerf, TestIndexCommit, TestNewestSegment, Test2BBinaryDocValues, TestReusableStringReader, TestBlockPostingsFormat, TestForUtil, TestLucene50CompoundFormat, TestFieldType, TestAllFilesHaveChecksumFooter, TestCodecUtil, TestDemoParallelLeafReader]
> [junit4] Completed on J3 in 610.62s, 5 tests, 1 error <<< FAILURES!
>
> [...truncated 15 lines...]
>
> BUILD FAILED
> /usr/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.x/build.xml:536: The following error occurred while executing this line:
> /usr/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.x/build.xml:477: The following error occurred while executing this line:
> /usr/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.x/build.xml:61: The following error occurred while executing this line:
> /usr/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.x/extra-targets.xml:39: The following error occurred while executing this line:
> /usr/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.x/lucene/build.xml:49: The following error occurred while executing this line:
> /usr/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.x/lucene/common-build.xml:1363: The following error occurred while executing this line:
> /usr/home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.x/lucene/common-build.xml:970: There were test failures: 409 suites, 3237 tests, 1 error, 37 ignored (27 assumptions)
>
> Total time: 39 minutes 43 seconds
> Build step 'Invoke Ant' marked build as failure
> Archiving artifacts
> Sending artifact delta relative to Lucene-Solr-NightlyTests-5.x #716
> Archived 5 artifacts
> Archive block size is 32768
> Received 0 blocks and 3856023 bytes
> Compression is 0.0%
> Took 1.5 sec
> Recording test results
> Email was triggered for: Failure
> Sending email for trigger: Failure
>
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscribe@lucene.apache.org
> For additional commands, e-mail: dev-help@lucene.apache.org

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@lucene.apache.org
For additional commands, e-mail: dev-help@lucene.apache.org
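
For anyone tracing the failures above: every "Too many open files" is raised by the test framework's handle-accounting mock filesystem (HandleLimitFS layered over HandleTrackingFS), which counts each open stream or directory handle and rejects opens beyond its cap, rather than by the OS file-descriptor limit. A minimal sketch of that counting idea, using only JDK classes and hypothetical names (LimitedOpener, MAX_OPEN) rather than the real org.apache.lucene.mockfile implementation:

    import java.io.IOException;
    import java.io.OutputStream;
    import java.nio.file.FileSystemException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.concurrent.atomic.AtomicInteger;

    /** Sketch of a handle-limiting open wrapper; names and the cap are illustrative. */
    public class LimitedOpener {
      // Hypothetical cap; the real test framework chooses its own limit.
      private static final int MAX_OPEN = 2048;
      private final AtomicInteger open = new AtomicInteger();

      /** Counts the open against the limit and fails like the traces above when over it. */
      public OutputStream newOutputStream(Path path) throws IOException {
        if (open.incrementAndGet() > MAX_OPEN) {
          open.decrementAndGet();
          throw new FileSystemException(path.toString(), null, "Too many open files");
        }
        final OutputStream delegate = Files.newOutputStream(path);
        return new OutputStream() {
          @Override public void write(int b) throws IOException { delegate.write(b); }
          @Override public void write(byte[] b, int off, int len) throws IOException { delegate.write(b, off, len); }
          @Override public void close() throws IOException {
            delegate.close();
            open.decrementAndGet();   // closing the stream frees a handle slot
          }
        };
      }
    }

In the traces, the cap appears to trip while ReindexingOneMerge.getMergeReaders opens an additional IndexWriter per merge for reindexing (TestDemoParallelLeafReader$3.reindex), so with many concurrent merge threads the number of simultaneously open index files grows until the wrapper's limit is exceeded.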