lucene-solr-user mailing list archives

From Markus Jelsma <markus.jel...@openindex.io>
Subject Small setFacetLimit() terminates Solr
Date Thu, 02 Jun 2016 12:47:34 GMT
Hello,

I ran across an awkward situation while collecting all ~7,000,000 distinct values for a field
via faceting. To keep things optimized and reduce memory consumption I don't do setFacetLimit(-1)
but use a reasonable limit of 10,000 or 100,000.
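For reference, the request is equivalent to something like the following sketch (Python stdlib only; the collection name, field name, and host are placeholders, not my actual setup):

```python
from urllib.parse import urlencode

# Facet request roughly equivalent to setFacetLimit(100000) in SolrJ.
# Host, collection, and field name below are placeholders.
params = {
    "q": "*:*",
    "rows": 0,               # no documents, only facet counts
    "facet": "true",
    "facet.field": "my_field",
    "facet.limit": 100000,   # the limit that triggers the problem for me
}
url = "http://localhost:8983/solr/collection1/select?" + urlencode(params)
print(url)
```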

To my surprise, Solr just stops or crashes. So, instead of decreasing the limit, I increased
it to 1,000,000, and it works! The weird thing is that with a limit of 100,000 or 200,000
and a heap of 3.5 GB, Solr stops, but with a limit of 1,000,000 and a reduced heap of
2.5 GB, it works just fine.

When it fails, it sometimes doesn't crash, but throws:
396882 WARN  (NIOServerCxn.Factory:0.0.0.0/0.0.0.0:9983) [   ] o.a.z.s.NIOServerCnxn caught
end of stream exception
EndOfStreamException: Unable to read additional data from client sessionid 0x155111cc413000d,
likely client has closed socket
        at org.apache.zookeeper.server.NIOServerCnxn.doIO(NIOServerCnxn.java:228)
        at org.apache.zookeeper.server.NIOServerCnxnFactory.run(NIOServerCnxnFactory.java:208)
        at java.lang.Thread.run(Thread.java:745)

This is on Solr 6.0.0 in cloud mode, with 3 shards and 2 replicas on my local machine.

What is happening here?

Many thanks,
Markus
