lucene-solr-user mailing list archives

From Doug Turnbull <>
Subject Re: Query always fail if row value is too high
Date Mon, 09 Feb 2015 15:47:46 GMT
Yago, I can't speak to the specific error. However, that's quite a lot of
rows for one request. Solr and Lucene need to build the full result set in
memory, which can create all kinds of heap issues. You could try using
start to page through the rows, but you'll run into a problem known as
deep paging.

In short, rows/start are optimized for users who want to page through the
top couple of pages of results. Performance gets progressively worse as you
go deeper into the results, because Lucene needs to rebuild all of the
preceding results on every subsequent request.

The solution to the problem "I want to pull every/many results from my
index" is to use cursors.
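To make that concrete, here's a minimal sketch of cursor-based deep paging against Solr's /select handler. The base URL, collection, and the assumption that the uniqueKey field is named `id` are all illustrative:

```python
# Sketch: paging through an entire Solr result set with cursorMark.
# Assumes the collection's uniqueKey field is "id" (adjust the sort if not).
import json
from urllib.parse import urlencode
from urllib.request import urlopen

def cursor_params(query, rows, cursor_mark, sort="id asc"):
    """Build the query parameters for one cursor page.

    cursorMark requires a sort that is a total order including the
    uniqueKey field, so Solr can resume deterministically. Note there
    is no "start" parameter -- the cursor replaces it.
    """
    return {
        "q": query,
        "rows": rows,
        "sort": sort,
        "cursorMark": cursor_mark,
        "wt": "json",
    }

def fetch_all(base_url, query, rows=1000):
    """Yield every matching document, one cursor page at a time."""
    cursor = "*"  # "*" starts a fresh cursor
    while True:
        url = base_url + "?" + urlencode(cursor_params(query, rows, cursor))
        with urlopen(url) as resp:
            data = json.load(resp)
        yield from data["response"]["docs"]
        next_cursor = data["nextCursorMark"]
        if next_cursor == cursor:  # Solr signals the end by repeating the mark
            break
        cursor = next_cursor
```

Each request stays cheap because Solr only has to materialize one page at a time, no matter how deep you are in the result set.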

Hossman has a good article linked here, and I wrote one too; these might
help.

Further, you probably want to set an explicit maximum on the number of
rows for your Solr cluster, to prevent folks from bringing Solr down by
specifying a billion rows.
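One way to enforce that is with an invariant on the search handler in solrconfig.xml; invariants pin a parameter to a fixed value that clients cannot override, which acts as a hard ceiling. The handler name and the 10000 value below are illustrative:

```xml
<!-- solrconfig.xml: pin rows on the default search handler so no
     request can ask for more. Pick a value your heap tolerates. -->
<requestHandler name="/select" class="solr.SearchHandler">
  <lst name="invariants">
    <int name="rows">10000</int>
  </lst>
</requestHandler>
```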

Hope that helps,

On Mon, Feb 9, 2015 at 10:29 AM, yriveiro <> wrote:

> I'm trying to retrieve a query result from Solr in CSV format with around
> 500K records, and I always get this error:
> "Expected mime type application/octet-stream but got application/xml. <?xml
> version=\"1.0\" encoding=\"UTF-8\"?>\n<response>\n<lst name=\"error\"><str
> name=\"msg\">application/x-www-form-urlencoded content length (6040427
> bytes) exceeds upload limit of 2048 KB</str><int
> name=\"code\">400</int></lst>\n</response>\n"
> If the rows value is lower, like 50000, the query doesn't fail.
> What am I doing wrong?
> -----
> Best regards
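For what it's worth, the 2048 KB in the quoted error matches Solr's default limit on form-encoded request bodies, so one guess is that the request itself (not the response) is too large. If that's the case, raising formdataUploadLimitInKB in solrconfig.xml should let it through; the 8192 value here is just an example:

```xml
<!-- solrconfig.xml: raise the cap on application/x-www-form-urlencoded
     request bodies (the default is 2048 KB). -->
<requestDispatcher>
  <requestParsers formdataUploadLimitInKB="8192"/>
</requestDispatcher>
```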

Doug Turnbull
Search Relevance Lead
OpenSource Connections <>
