lucene-solr-user mailing list archives

From Erick Erickson <>
Subject Re: Data Import Handler, also "Real Time" index updates
Date Fri, 03 Mar 2017 17:30:28 GMT
One way to handle this (presuming SolrCloud) is collection aliasing.
You create two collections, c1 and c2, and two aliases: at the start,
"index" is aliased to c1 and "search" is aliased to c2. Now do your
full import to "index" (and, BTW, you'd be well advised to do at
least one hard commit with openSearcher=false during that time, or
you risk replaying all the docs in the tlog).

When the full import is done, switch the aliases so "search" points to c1 and
"index" points to c2. Rinse. Repeat. Your client apps always use the same alias;
the alias switching makes it transparent whether c1 or c2 is being used.
By that I mean your user-facing app uses "search" and your indexing client
uses "index".
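The alias flip above can be sketched as follows. This is a minimal illustration, not Solr client code: the `createalias_url` and `swap` helpers are hypothetical, and the base URL is an assumed local install. The underlying Collections API action (CREATEALIAS with `name` and `collections` parameters) is real.

```python
# Sketch of the two-collection alias flip, assuming SolrCloud's
# Collections API. Helper names are illustrative, not Solr APIs.

SOLR = "http://localhost:8983/solr"  # assumed local Solr base URL

def createalias_url(alias, collection):
    """Build the Collections API call that (re)points an alias."""
    return (f"{SOLR}/admin/collections?action=CREATEALIAS"
            f"&name={alias}&collections={collection}")

def swap(aliases):
    """Flip which collection 'index' and 'search' point at."""
    return {"index": aliases["search"], "search": aliases["index"]}

# Start: full imports go to c1, user-facing queries hit c2.
aliases = {"index": "c1", "search": "c2"}

# After the full import finishes, flip both aliases. Issuing
# CREATEALIAS for an existing alias repoints it atomically.
aliases = swap(aliases)
for name, coll in aliases.items():
    print(createalias_url(name, coll))
```

Because clients only ever see the alias names, the flip needs no client-side change; the next request to "search" simply lands on the freshly built collection.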

You can now do your live updates to the "search" alias, which has soft
commits configured.
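The commit behavior mentioned above (soft commits for live visibility, hard commits with openSearcher=false so the tlog is truncated without opening a new searcher) is configured in each collection's solrconfig.xml. A sketch, with illustrative intervals:

```xml
<!-- In solrconfig.xml; the 5s/60s intervals are illustrative. -->
<updateHandler class="solr.DirectUpdateHandler2">
  <autoSoftCommit>
    <maxTime>5000</maxTime>          <!-- make updates searchable every 5s -->
  </autoSoftCommit>
  <autoCommit>
    <maxTime>60000</maxTime>         <!-- durable hard commit every 60s -->
    <openSearcher>false</openSearcher> <!-- don't open a searcher on hard commit -->
  </autoCommit>
</updateHandler>
```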
Of course you have to have some mechanism for replaying, into the "index"
alias before you switch, all the live updates that came in while you were
doing your full import, but you say you have that handled.


On Fri, Mar 3, 2017 at 9:22 AM, Alexandre Rafalovitch
<> wrote:
> On 3 March 2017 at 12:17, Sales <> wrote:
>> When we enabled those, during the index, the data disappeared since it kept soft
>> committing during the import process,
> This part does not quite make sense. Could you expand on this "data
> disappeared" part so we can understand what the issue is?
> The main issue with "update" is that all fields (apart from pure
> copyField destinations) need to be stored, so the document can be
> reconstructed, updated, re-indexed. Perhaps you have something strange
> happening around that?
> Regards,
>    Alex.
> ----
> - Resources for Solr users, new and experienced
