lucene-solr-user mailing list archives

From steven crichton <stevencrich...@mac.com>
Subject Re: Solr Stalls on Bulk indexing, no logs or errors
Date Wed, 04 Dec 2013 20:19:02 GMT
Yes, I can continue to query after this importer goes down, and whilst it's running.

The bulk commit is done via a JSON handler in PHP. There are 121,000 records that need to go
into the index, so this is done in MySQL retrieve calls chunked at 5,000 records, parsing the
data as required.

	workflow:
		
		get record
		create {add doc… } JSON
		Post to CORE/update/json
		

I stopped doing a hard commit every 1,000 records, to see if that was an issue.
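For reference, the chunk-and-post loop described above can be sketched roughly as below. This is a minimal sketch in Python rather than the original PHP, and the Solr URL, field names, and helper names are assumptions, not the actual importer code:

```python
import json
from itertools import islice

def chunked(iterable, size=5000):
    """Yield successive lists of up to `size` records."""
    it = iter(iterable)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return
        yield batch

def build_update_body(records):
    """Solr's /update/json handler accepts a plain JSON array of documents."""
    return json.dumps([{"id": r["id"], "title": r["title"]} for r in records])

# Posting each batch (hypothetical URL; requires a running Solr core):
# import urllib.request
# req = urllib.request.Request(
#     "http://localhost:8983/solr/CORE/update/json",
#     data=build_update_body(batch).encode("utf-8"),
#     headers={"Content-Type": "application/json"})
# urllib.request.urlopen(req)
```

With 121,000 records and 5,000-record chunks this works out to 25 update requests, so a stall at a fixed record count (69,578) would fall mid-way through the 14th batch.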


The autoCommit settings are:

<autoCommit>
      <maxDocs>${solr.autoCommit.MaxDocs:5000}</maxDocs>
      <maxTime>${solr.autoCommit.MaxTime:24000}</maxTime>
</autoCommit>
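One thing worth noting about the config above: with only a hard autoCommit, each commit (every 5,000 docs or 24 seconds here) also opens a new searcher, which can block if autowarming is slow. A common Solr 4 pattern, sketched below with the same property-naming convention as above (the autoSoftCommit property name is an assumption, not from the original config), is to keep hard commits cheap with openSearcher=false and handle visibility via soft commits:

```xml
<!-- hard commit: flush documents to disk, but don't open a new searcher -->
<autoCommit>
  <maxDocs>${solr.autoCommit.MaxDocs:5000}</maxDocs>
  <maxTime>${solr.autoCommit.MaxTime:24000}</maxTime>
  <openSearcher>false</openSearcher>
</autoCommit>

<!-- soft commit: make new documents visible to searchers -->
<autoSoftCommit>
  <maxTime>${solr.autoSoftCommit.MaxTime:60000}</maxTime>
</autoSoftCommit>
```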


I've pretty much worked from the Drupal schemas for Solr 4:
https://drupal.org/project/apachesolr

At one point I thought it could be malformed data, but even after reducing the records down to
just the id and title, it crashes at the same point. That is, queries still work, but the
import handler does nothing at all.


Tomcat logs seem to indicate no major issues.


There isn't some obscure variable somewhere that sets an upper index limit, is there?

Regards,
Steven



On 4 Dec 2013, at 20:02, Erick Erickson [via Lucene] <ml-node+s472066n4104984h38@n3.nabble.com>
wrote:

> There's a known issue with SolrCloud with multiple shards, but 
> you haven't told us whether you're using that. The test for 
> whether you're running in to that is whether you can continue 
> to _query_, just not update. 
> 
> But you need to tell us more about your setup. In particular 
> your commit settings (hard and soft), your solrconfig settings, 
> particularly around autowarming, how you're "bulk indexing", 
> SolrJ? DIH? a huge CSV file? 
> 
> Best, 
> Erick 
> 
> 
> On Wed, Dec 4, 2013 at 2:30 PM, steven crichton <[hidden email]>wrote: 
> 
> > I am finding with a bulk index using SOLR 4.3 on Tomcat, that when I reach 
> > 69578 records the server stops adding anything more. 
> > 
> > I've tried reducing the data sent to the bare minimum of fields and using 
> > ASC and DESC data to see if it could be a field issue. 
> > 
> > Is there anything I could look at for this? As I'm not finding anything 
> > similar noted before. Does tomcat have issues with closing connections that 
> > look like DDOS attacks? Or could it be related to too many commits in too 
> > short a time? 
> > 
> > Any help will be very greatly appreciated. 
> > 
> > 
> > 





--
View this message in context: http://lucene.472066.n3.nabble.com/Solr-Stalls-on-Bulk-indexing-no-logs-or-errors-tp4104981p4104990.html
Sent from the Solr - User mailing list archive at Nabble.com.