incubator-couchdb-user mailing list archives

From Zachary Zolton <zachary.zol...@gmail.com>
Subject Re: Insert performance
Date Mon, 04 May 2009 15:58:06 GMT
Short answer: pass true as the second argument, i.e. db.save_doc(hash, true), so writes are batched through CouchDB's _bulk_docs API instead of one HTTP request per document.

Also, consider moving this thread to the CouchRest Google Group:
http://groups.google.com/group/couchrest/topics
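For context: with one HTTP round trip per document, the network overhead, not CPU, is usually the bottleneck, which matches the ~30% CPU figure below. The batching idea behind bulk saves can be sketched in plain Ruby (the `BulkWriter` class and its names here are illustrative, not CouchRest's internals; a real client would POST each batch to /db/_bulk_docs):

```ruby
# Sketch of client-side bulk batching. Illustrative only: a real CouchDB
# client would POST each flushed batch to /db/_bulk_docs in one request.
class BulkWriter
  attr_reader :flushes

  def initialize(batch_size)
    @batch_size = batch_size
    @buffer  = []
    @flushes = []   # records each batch that would have been POSTed
  end

  # Queue a doc; flush automatically once the buffer fills.
  def save_doc(doc)
    @buffer << doc
    flush if @buffer.size >= @batch_size
  end

  # Send the buffered docs in one request and clear the buffer.
  def flush
    return if @buffer.empty?
    @flushes << @buffer
    @buffer = []
  end
end

writer = BulkWriter.new(500)
1000.times { |i| writer.save_doc('key' => i) }
writer.flush   # push any remainder
```

With a batch size of 500, the 1000 docs above go out in 2 requests instead of 1000, which is the whole performance win.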

Cheers,
zdzolton

On Mon, May 4, 2009 at 10:40 AM, Tom Nichols <tmnichols@gmail.com> wrote:
> Hi, I have some questions about insert performance.
>
> I have a single CouchDB 0.9.0 node running on a small EC2 instance.  I
> attached a huge EBS volume and mounted it where CouchDB stores its data
> files.  I fired up several Ruby scripts running inserts, and after a
> weekend I only have about 30 GB / 12M rows of data, which seems low.
> 'top' tells me the CPU is only about 30% utilized.
>
> Any idea what I might be doing wrong?  I pretty much just followed
> these instructions:
> http://wiki.apache.org/couchdb/Getting_started_with_Amazon_EC2
>
> My ruby script looks like this:
> #!/usr/bin/env ruby
> #Script to load random data into CouchDB
>
> require 'rubygems'
> require 'couchrest'
>
> db = CouchRest.database! "http://127.0.0.1:5984/#{ARGV[0]}"
> puts "Created database: #{ARGV[0]}"
>
> max = 9999999999999999
> loop do
>   puts 'loading...'
>   (0..max).each do |val|
>     # was "val ${val}" -- Ruby interpolation needs #{...}
>     db.save_doc({ :key => val, 'val one' => "val #{val}",
>                   'val2' => "#{ARGV[1]} #{val}" })
>   end
> end
>
>
> Thanks in advance...
>
