lucene-solr-user mailing list archives

From Benson Margulies <bimargul...@gmail.com>
Subject Not so concurrent concurrency
Date Wed, 29 May 2013 01:00:08 GMT
 I can't quite apply SolrMeter to my problem, so I put together something of
my own. The brains of the operation is the function below.

It feeds a ConcurrentUpdateSolrServer about 95 documents, each about
10 MB, and 'threads' is six. Yet Solr barely uses more than one
core.

    private long doIteration(File[] filesToRead) throws IOException, SolrServerException {
        // One server with a queue size of 1000 and 'threads' (six) runner threads.
        ConcurrentUpdateSolrServer concurrentServer = new ConcurrentUpdateSolrServer(
                launcher.getSolrServer().getBaseURL(), 1000, threads);
        UpdateRequest updateRequest = new UpdateRequest(updateUrl);
        updateRequest.setCommitWithin(1);
        Stopwatch stopwatch = new Stopwatch();

        List<File> allFiles = Arrays.asList(filesToRead);
        Iterator<File> fileIterator = allFiles.iterator();
        while (fileIterator.hasNext()) {
            // Accumulate files into a batch until BATCH_LIMIT bytes is reached.
            List<File> thisBatch = Lists.newArrayList();
            int batchByteCount = 0;
            while (batchByteCount < BATCH_LIMIT && fileIterator.hasNext()) {
                File thisFile = fileIterator.next();
                thisBatch.add(thisFile);
                batchByteCount += thisFile.length();
            }
            LOG.info(String.format("update %s files", thisBatch.size()));
            updateRequest.setDocIterator(new StreamingDocumentIterator(thisBatch));
            // Time only the request itself; the stopwatch accumulates across batches.
            stopwatch.start();
            concurrentServer.request(updateRequest);
            concurrentServer.blockUntilFinished();
            stopwatch.stop();
        }
        // Completing the truncated method: return the accumulated request time.
        return stopwatch.elapsedMillis();
    }
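For what it's worth, the inner loop's batching-by-byte-count logic can be isolated and checked on its own. This is a minimal sketch with a hypothetical `batchByBytes` helper (not part of the original code); like the loop above, it tests the running byte count *before* adding each file, so a batch can overshoot the limit by up to one file:

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

public class Batcher {
    // Hypothetical helper mirroring the inner while loop above: keep
    // adding sizes to the current batch until the running total reaches
    // the limit, then start a new batch.
    static List<List<Long>> batchByBytes(List<Long> sizes, long limit) {
        List<List<Long>> batches = new ArrayList<>();
        Iterator<Long> it = sizes.iterator();
        while (it.hasNext()) {
            List<Long> batch = new ArrayList<>();
            long byteCount = 0;
            while (byteCount < limit && it.hasNext()) {
                long size = it.next();
                batch.add(size);
                byteCount += size;
            }
            batches.add(batch);
        }
        return batches;
    }
}
```

With 95 files of ~10 MB each and a sufficiently large BATCH_LIMIT, this groups several files per request, which is why the log line reports more than one file per update.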
