lucene-solr-user mailing list archives

From Pranav Prakash <>
Subject How to achieve Indexing @ 270GiB/hr
Date Tue, 04 Oct 2011 13:17:34 GMT

While going through the article 265% indexing speedup with Lucene's
concurrent flushing<>, I was stunned by the endless possibilities by
which indexing speed could be improved.

I'd like to take inputs from everyone over here as to how to achieve this
speed. As far as I understand there are two broad ways of feeding data to
Solr -

   1. Using DataImportHandler
   2. Using HTTP to POST docs to Solr.
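For reference, the second approach boils down to POSTing an <add> payload to
Solr's XML update handler. A minimal sketch with Python's standard library
(the URL, core path, and field names here are placeholder assumptions, not
from the original setup):

```python
import urllib.request
from xml.etree import ElementTree as ET

def build_add_xml(docs):
    """Build a Solr <add> payload from a list of dicts (field -> value)."""
    add = ET.Element("add")
    for doc in docs:
        doc_el = ET.SubElement(add, "doc")
        for field, value in doc.items():
            field_el = ET.SubElement(doc_el, "field", name=field)
            field_el.text = str(value)
    return ET.tostring(add, encoding="unicode")

def post_update(xml_payload, url="http://localhost:8983/solr/update"):
    """POST one batch to Solr's XML update handler (URL is an assumption)."""
    req = urllib.request.Request(
        url,
        data=xml_payload.encode("utf-8"),
        headers={"Content-Type": "text/xml; charset=utf-8"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()

# Example payload for a batch of two hypothetical docs:
payload = build_add_xml([{"id": 1, "title": "foo"}, {"id": 2, "title": "bar"}])
```

Multiple client threads or processes can call post_update concurrently, which
is one way to keep a concurrent-flushing indexer fed.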

The speeds at which the article describes indexing seem kinda too much to
expect using the second approach. Or is it possible using multiple instances
feeding docs to Solr?

My current setup does the following -

   1. Execute SQL queries to create the set of documents that need to be
   indexed.
   2. Go through the columns one by one, create XMLs for them, and send
   them over to Solr in batches of at most 500 docs.
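The batching step above can be sketched as follows (a hypothetical helper;
the batch size of 500 is the one quoted in the setup):

```python
def batches(docs, size=500):
    """Yield successive batches of at most `size` docs from a list."""
    for start in range(0, len(docs), size):
        yield docs[start:start + size]

# e.g. 1200 docs split into batches of 500, 500, and 200
sizes = [len(b) for b in batches(list(range(1200)))]
```

Each yielded batch would then be turned into one XML <add> payload and sent
in a single request, so per-request overhead is paid once per 500 docs
rather than once per doc.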

Even if using DataImportHandler, what are the ways this could be optimized?
If I am able to solve the problem of indexing data in our current setup, my
life would become a lot easier.
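For the DataImportHandler route, the usual starting point is a
data-config.xml pulling rows straight from the database. A minimal sketch
(driver, connection URL, table, and field names here are all placeholder
assumptions; batchSize="-1" is the commonly used setting to make the MySQL
JDBC driver stream rows instead of buffering the whole result set):

```xml
<dataConfig>
  <dataSource driver="com.mysql.jdbc.Driver"
              url="jdbc:mysql://localhost/mydb"
              user="solr" password="secret"
              batchSize="-1"/>
  <document>
    <entity name="doc" query="SELECT id, title, body FROM documents">
      <field column="id"    name="id"/>
      <field column="title" name="title"/>
      <field column="body"  name="body"/>
    </entity>
  </document>
</dataConfig>
```

Compared with the SQL-then-XML-then-POST pipeline, this skips the
intermediate XML serialization and HTTP hop entirely.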

*Pranav Prakash*

"temet nosce"

Twitter <> | Blog <> |
Google <>
