lucene-solr-user mailing list archives

From Vinay Pothnis <poth...@gmail.com>
Subject Re: [solr cloud] solr hangs when indexing large number of documents from multiple threads
Date Mon, 24 Jun 2013 20:28:52 GMT
Here is the ulimit -a output:

  core file size          (blocks, -c) 0
  data seg size           (kbytes, -d) unlimited
  scheduling priority             (-e) 0
  file size               (blocks, -f) unlimited
  pending signals                 (-i) 179963
  max locked memory       (kbytes, -l) 64
  max memory size         (kbytes, -m) unlimited
  open files                      (-n) 32769
  pipe size            (512 bytes, -p) 8
  POSIX message queues     (bytes, -q) 819200
  real-time priority              (-r) 0
  stack size              (kbytes, -s) 10240
  cpu time               (seconds, -t) unlimited
  max user processes              (-u) 140000
  virtual memory          (kbytes, -v) unlimited
  file locks                      (-x) unlimited
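For reference, the indexing client described in the quoted message below
boils down to something like the following (a minimal SolrJ 4.3.x sketch,
not our actual code; the load-balancer URL, collection name, field names,
and per-thread document count are placeholders):

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

import org.apache.solr.client.solrj.impl.HttpSolrServer;
import org.apache.solr.common.SolrInputDocument;

public class BatchIndexer {
    public static void main(String[] args) throws Exception {
        // All update requests go through the load balancer in front of
        // the 3 Solr nodes (placeholder URL).
        final HttpSolrServer solr =
                new HttpSolrServer("http://solr-lb.example.com/solr/collection1");

        final int numThreads = 5;   // the hang shows up with more than 3-5 threads
        final int batchSize = 20;   // documents per update request

        ExecutorService pool = Executors.newFixedThreadPool(numThreads);
        for (int t = 0; t < numThreads; t++) {
            pool.submit(new Runnable() {
                public void run() {
                    try {
                        List<SolrInputDocument> batch =
                                new ArrayList<SolrInputDocument>(batchSize);
                        for (int i = 0; i < 100000; i++) {
                            SolrInputDocument doc = new SolrInputDocument();
                            doc.addField("id",
                                    Thread.currentThread().getName() + "-" + i);
                            batch.add(doc);
                            if (batch.size() == batchSize) {
                                // This call stops returning once the cluster hangs.
                                solr.add(batch);
                                batch.clear();
                            }
                        }
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.HOURS);
        solr.shutdown();
    }
}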

On Mon, Jun 24, 2013 at 12:47 PM, Yago Riveiro <yago.riveiro@gmail.com> wrote:

> Hi,
>
> I have the same issue too, and my deployment is almost exactly the same:
> http://lucene.472066.n3.nabble.com/updating-docs-in-solr-cloud-hangs-td4067388.html#a4067862
>
> With some concurrency and batches of 10, Solr apparently hits a deadlock
> while distributing updates.
>
> Can you dump the ulimit configuration on your servers? Some people have
> had the same issue because they reached the ulimit maximums defined for
> file descriptors and processes.
>
> --
> Yago Riveiro
>
>
> On Monday, June 24, 2013 at 7:49 PM, Vinay Pothnis wrote:
>
> > Hello All,
> >
> > I have the following SolrCloud setup:
> >
> > * Solr version 4.3.1
> > * 3-node SolrCloud cluster + replication factor 2
> > * 3 ZooKeeper nodes
> > * load balancer in front of the 3 Solr nodes
> >
> > I am seeing this strange behavior when I am indexing a large number of
> > documents (10 million). When I have more than 3-5 threads sending
> > documents (in batches of 20) to Solr, Solr sometimes goes into a hung
> > state. After this, all the update requests time out. What we see via
> > AppDynamics (a performance monitoring tool) is that a number of threads
> > are stalled. The stack trace for one of the threads is shown below.
> >
> > The cluster has to be restarted to recover from this. When I reduce the
> > concurrency to 1, 2, or 3 threads, the indexing goes through smoothly.
> > Any pointers as to what could be wrong here?
> >
> > We send the updates to one of the nodes in the SolrCloud cluster through
> > a load balancer.
> >
> > Thanks
> > Vinay
> >
> > Thread Name: qtp2141131052-78
> > ID: 78
> > Time: Fri Jun 21 23:20:22 GMT 2013
> > State: WAITING
> > Priority: 5
> >
> > sun.misc.Unsafe.park(Native Method)
> > java.util.concurrent.locks.LockSupport.park(LockSupport.java:186)
> > java.util.concurrent.locks.AbstractQueuedSynchronizer.parkAndCheckInterrupt(AbstractQueuedSynchronizer.java:834)
> > java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:994)
> > java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1303)
> > java.util.concurrent.Semaphore.acquire(Semaphore.java:317)
> > org.apache.solr.util.AdjustableSemaphore.acquire(AdjustableSemaphore.java:61)
> > org.apache.solr.update.SolrCmdDistributor.submit(SolrCmdDistributor.java:418)
> > org.apache.solr.update.SolrCmdDistributor.submit(SolrCmdDistributor.java:368)
> > org.apache.solr.update.SolrCmdDistributor.flushAdds(SolrCmdDistributor.java:300)
> > org.apache.solr.update.SolrCmdDistributor.finish(SolrCmdDistributor.java:96)
> > org.apache.solr.update.processor.DistributedUpdateProcessor.doFinish(DistributedUpdateProcessor.java:462)
> > org.apache.solr.update.processor.DistributedUpdateProcessor.finish(DistributedUpdateProcessor.java:1178)
> > org.apache.solr.update.processor.LogUpdateProcessor.finish(LogUpdateProcessorFactory.java:179)
> > org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:83)
> > org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:135)
> > org.apache.solr.core.SolrCore.execute(SolrCore.java:1820)
> > org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:656)
> > org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:359)
> > org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:155)
> > org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1423)
> > org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:450)
> > org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:138)
> > org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:564)
> > org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:213)
> > org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1083)
> > org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:379)
> > org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:175)
> > org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1017)
> > org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:136)
> > org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:258)
> > org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:109)
> > org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
> > org.eclipse.jetty.server.Server.handle(Server.java:445)
> > org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:260)
> > org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:225)
> > org.eclipse.jetty.io.AbstractConnection$ReadCallback.run(AbstractConnection.java:358)
> > org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:596)
> > org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:527)
> > java.lang.Thread.run(Thread.java:722)
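The frames at the top of that trace show the request thread parked in
java.util.concurrent.Semaphore.acquire() via Solr's AdjustableSemaphore,
i.e. waiting for an update-distribution permit that is never released. A
self-contained sketch of that parked state (plain java.util.concurrent,
not Solr code; the permit count and thread names are illustrative):

import java.util.concurrent.Semaphore;

public class ParkedThreadsDemo {
    public static void main(String[] args) {
        // A small permit pool, standing in for the semaphore that bounds
        // in-flight distributed updates.
        final Semaphore permits = new Semaphore(2);

        for (int t = 0; t < 5; t++) {
            new Thread(new Runnable() {
                public void run() {
                    try {
                        // Once all permits are taken and never released
                        // (e.g. because replies never arrive), every extra
                        // thread parks here forever, producing the
                        // Unsafe.park -> LockSupport.park -> Semaphore.acquire
                        // frames seen in the trace above.
                        permits.acquire();
                        Thread.sleep(Long.MAX_VALUE); // hold the permit indefinitely
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                }
            }, "qtp-demo-" + t).start();
        }
        // A thread dump (jstack <pid>) now shows three threads WAITING in
        // Semaphore.acquire(), matching the hung state described above.
    }
}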
