couchdb-user mailing list archives

From Tom Nichols <>
Subject Insert performance
Date Mon, 04 May 2009 15:40:53 GMT
Hi, I have some questions about insert performance.

I have a single CouchDB 0.9.0 node running on a small EC2 instance.  I
attached a huge EBS volume to it and mounted it where CouchDB's data
files are stored.  I fired up some Ruby scripts running inserts, and
after a weekend I only have about 30GB / 12M rows of data, which
seems small.  'top' tells me my CPU is only about 30% utilized.

Any idea what I might be doing wrong?  I pretty much just followed
these instructions:

My ruby script looks like this:
#!/usr/bin/env ruby
#Script to load random data into CouchDB

require 'rubygems'
require 'couchrest'

db = CouchRest.database! "#{ARGV[0]}"
puts "Created database: #{ARGV[0]}"

max = 9999999999999999
while 1
        puts 'loading...'
        for val in 0..max
                db.save_doc({ :key => val, 'val one' => "val #{val}",
                        'val2' => "#{ARGV[1]} #{val}" })
        end
end
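In case it matters: the script above does one HTTP request per document, so the bottleneck may be request overhead rather than CouchDB itself. Here's roughly what a batched version would look like -- untested sketch, and it assumes CouchRest exposes CouchDB's _bulk_docs endpoint as bulk_save; the helper name and batch size are just illustrative:

```ruby
# Hypothetical sketch: accumulate documents and POST them to CouchDB
# in batches via _bulk_docs, instead of one round trip per doc.
BATCH_SIZE = 1000

# Build the same docs as the loop above, grouped into batches.
def in_batches(range, batch_size)
  batch = []
  flushed = []
  range.each do |val|
    batch << { :key => val, 'val one' => "val #{val}" }
    if batch.size >= batch_size
      flushed << batch
      batch = []
    end
  end
  flushed << batch unless batch.empty?
  flushed
end

# Against a live node this would become something like (untested):
#   db = CouchRest.database! "#{ARGV[0]}"
#   in_batches(0..max, BATCH_SIZE).each { |docs| db.bulk_save(docs) }
```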

Thanks in advance...
