lucene-solr-user mailing list archives

From Tim Potter <>
Subject RE: monitoring solr logs
Date Mon, 30 Dec 2013 16:53:36 GMT
We (LucidWorks) are actively developing logstash4solr, so if you have issues, let us know.
So far, so good for me, but I upgraded to logstash 1.3.2; even though the logstash4solr
distribution includes 1.2.2, you can use the newer one. I'm not quite in production with my
logstash4solr <- rabbit-mq <- log4j <- Solr solution yet though ;-)
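For the logstash end of a chain like that, a minimal RabbitMQ input block might look like the following; the host and queue name here are illustrative assumptions, not details from my actual setup:

```
input {
  rabbitmq {
    # host/queue are placeholders; match whatever your RabbitMQ appender publishes to
    host => "localhost"
    queue => "solr-logs"
    durable => true
  }
}
```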

Yeah, 50GB is too much logging for only 150K docs. Maybe start by filtering by log level (WARN
and more severe). If a server crashes, you're likely to see some errors on the logstash side,
but sometimes you may have to SSH to the specific box and look at the local log (so definitely
append all messages to the local Solr log too). I'm using something like the following for
local logging:

log4j.rootLogger=INFO, file
# appender definition filled in for completeness; the file path is illustrative
log4j.appender.file=org.apache.log4j.FileAppender
log4j.appender.file.File=solr.log
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{ISO8601} [%t] %-5p %c{3} %x - %m%n
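To do the WARN-and-above filtering mentioned above while keeping full INFO logging locally, a Threshold on the remote appender works; a sketch, where `logstash` is a hypothetical appender name standing in for your RabbitMQ/socket appender:

```
# local file gets everything at INFO and above; remote appender only WARN+
log4j.rootLogger=INFO, file, logstash
# 'logstash' is a placeholder name for your RabbitMQ/Socket appender definition
log4j.appender.logstash.Threshold=WARN
```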

Timothy Potter
Sr. Software Engineer, LucidWorks

From: adfel70 <>
Sent: Monday, December 30, 2013 9:34 AM
Subject: RE: monitoring solr logs

Actually, I was considering using logstash4solr, but it didn't seem mature.
Does it work fine? Any known bugs?

Are you collecting the logs in the same Solr cluster you use for the
production systems?
If so, what will you do if for some reason Solr is down and you would like
to analyze the logs to see what happened?

Btw, I started a new Solr cluster with 7 shards, replicationFactor=3, and ran
an indexing job of 400K docs.
It got stuck at 150K because I used SocketAppender directly to write to
logstash and the logstash disk got full.

That's why I moved to using AsyncAppender, and I plan on moving to the
RabbitMQ appender you described.
But this is also why I wanted to filter some of the logs: indexing 150K docs
produced 50GB of logs, which seemed like too much.
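As an aside, log4j 1.2's AsyncAppender can't be fully wired up from a plain properties file (the appender-ref it needs is XML-only), so wrapping a SocketAppender asynchronously usually means switching to log4j.xml. A sketch, with the host and names invented for illustration:

```xml
<!-- log4j.xml fragment: wrap the socket appender so logging can't block indexing -->
<appender name="socket" class="org.apache.log4j.net.SocketAppender">
  <param name="RemoteHost" value="logstash-host"/> <!-- illustrative host -->
  <param name="Port" value="4560"/>
</appender>
<appender name="async" class="org.apache.log4j.AsyncAppender">
  <param name="BufferSize" value="512"/>
  <param name="Blocking" value="false"/> <!-- drop events rather than block when the buffer fills -->
  <appender-ref ref="socket"/>
</appender>
```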

Tim Potter wrote
> I'm using logstash4solr ( for something similar
> ...
> I setup my Solr to use Log4J by passing the following on the command-line
> when starting Solr:
> -Dlog4j.configuration=file:///$SCRIPT_DIR/
> Then I use a custom Log4J appender that writes to RabbitMQ:
> You can then configure a RabbitMQ input for logstash -
> This decouples the log writes from log indexing in logstash4solr, which
> scales better for active Solr installations.
> Btw ... I just log everything from Solr using this approach but you can
> use standard Log4J configuration settings to limit which classes / log
> levels to send to the RabbitMQ appender.
> Cheers,
> Timothy Potter
> Sr. Software Engineer, LucidWorks
> ________________________________________
> From: adfel70 <adfel70@...>
> Sent: Monday, December 30, 2013 8:15 AM
> To: solr-user@...
> Subject: monitoring solr logs
> Hi,
> I'm trying to figure out which Solr and ZooKeeper logs I should monitor
> and
> collect.
> All the logs will be written to a file, but I want to collect some of them
> with logstash in order to be able to analyze them efficiently.
> Any input on which classes' logs I should collect?
> Thanks.

