couchdb-user mailing list archives

From "R.J. Steinert"
Subject Re: Best way to backup and restore after going offline
Date Mon, 13 May 2013 16:49:09 GMT
This thread is an interesting one for the ground computing community.  It
points to a way of distributing databases on SD cards without needing a
second computer to sync the database upon arrival.  It's possible to put
the {database name}.couch file that you want to send on a memory stick and
then move it to the target computer's database directory (in the case of
the BeLL, that's /var/lib/couchdb/1.2.0/) under a different name.  You then
replicate that newly arrived database into the local one waiting to be
synced.  This still means you have to have enough disk space for two copies
of the database, and you can't send just the changes since X date.  This
problem could be mitigated by having a database on every target computer
that is a symlink to a drive on the USB port.
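The arrival-side replication step described above boils down to a POST to
CouchDB's /_replicate endpoint. Here is a minimal sketch of building that
request body; the database names ("incoming", "library") and the local URL
are assumptions for illustration, not names from this thread:

```python
import json

COUCH_URL = "http://localhost:5984"  # assumed address of the local CouchDB


def replicate_payload(source, target):
    """Build the JSON body for a POST to /_replicate.

    `source` is the database copied in from the USB stick (saved under a
    different name on arrival); `target` is the local database waiting to
    be synced.
    """
    return {"source": source, "target": target}


# Hypothetical names: the transferred file was saved as incoming.couch,
# and the local database is called "library".
body = json.dumps(replicate_payload("incoming", "library"))
# A client would then POST `body` with Content-Type: application/json to
# COUCH_URL + "/_replicate" to pull the arrived data into the local copy.
```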

The downside to this approach is that there are still some things that we
would have to program to automate it (make it easy): 1) the process of
getting the {database name}.couch file, and 2) continuous sync between the
symlinked sync database and the local database (continuous pull and push
replication from a sometimes-broken symlinked database might be a bad
thing).  For now, the bell.local/sync.local model as proposed in the BeLL
Ground Server Manual is the most stable approach I know of.

Lastly, it would still be cool to be able to get a diff of a database since
X date that could be used to sync changes to remote offline databases.
That would mean that if you have a 5GB database, a 2GB USB drive, and 1GB
in changes since X date, you could still send the sync on the USB drive.
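CouchDB tracks changes by update sequence rather than by date, but its
_changes feed can serve as a rough version of this diff: ask for everything
after a known sequence number, with the documents included, and write that
to the USB drive. A sketch of building such a request (the database name
and sequence number here are made up):

```python
from urllib.parse import urlencode

COUCH_URL = "http://localhost:5984"  # assumed address of the local CouchDB


def changes_url(db, since, include_docs=True):
    """URL for the _changes feed, returning only updates after `since`.

    With include_docs=true the response carries the changed documents
    themselves, which is what you would save to the transfer medium.
    """
    params = urlencode({"since": since,
                        "include_docs": str(include_docs).lower()})
    return "{}/{}/_changes?{}".format(COUCH_URL, db, params)


# Hypothetical: database "library", last known update sequence 1200.
url = changes_url("library", 1200)
```

Applying such a dump on the offline side is the part CouchDB does not give
you for free; replication between two local instances (as discussed later
in this thread) handles that step automatically.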


---------- Forwarded message ----------
From: Andy Ennamorato <>
Date: Fri, May 10, 2013 at 11:45 AM
Subject: Re: Best way to backup and restore after going offline
To: "" <>


Thanks, that was definitely helpful.

We're having issues replicating but once it finishes we'll give this a shot.


Sent from my iPhone

On May 7, 2013, at 3:59 PM, Nils Breunese <> wrote:

> andy e <> wrote:
>> We're trying to replicate a database while connected to the great big
>> Internet (the npm repo, actually). But after it is finished replicating we
>> need to take our copy and run it in a different and offline environment.
>> What's the best way to proceed? Just backup the database file(s), then copy
>> them over to our offline instance (we do this via disc/usb/etc)? Anything
>> to be aware of when doing this ("make sure to grab this file", "run this
>> command first before importing")?
> A single .couch file represents a single database. You can just copy over
> these files. There's even no need to shut down the source instance (unlike
> when using something like MySQL for instance).
> If you also want to copy over the generated view indexes, you can also
> copy over the hidden directories in the data directory. This only makes
> sense if regenerating these indexes on the target instance takes a lot of
> time.
>> Is there any way to pick up just the changed documents, so that the next
>> time we do this, we only have to transfer what is new (versus copying the
>> entire thing again)? That's not too big of a deal if so but would be a
>> nice-to-have.
> Updating your online copy could be done by rerunning replication from npm
> to your online instance. You could use rsync to just copy over the
> differences in the file at the block level. Or you could start a second
> CouchDB instance on the offline machine (use a different IP address and/or
> port) using the updated database files on the transfer medium and run
> replication between those two CouchDB instances on the offline machine.
> HTH, Nils.
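Nils's two-instance suggestion amounts to a replication where source and
target are full URLs on different ports of the same offline machine. A
sketch of building that request body; the ports and the database name
("registry") are assumptions for illustration:

```python
import json


def cross_instance_replication(db, source_port=5985, target_port=5984):
    """Replication body pulling `db` from a second CouchDB instance
    (serving the database files brought in on the transfer medium,
    assumed here on port 5985) into the main offline instance on 5984."""
    return {
        "source": "http://localhost:{}/{}".format(source_port, db),
        "target": "http://localhost:{}/{}".format(target_port, db),
    }


body = cross_instance_replication("registry")
# POST json.dumps(body) to the main instance's /_replicate endpoint; once
# replication finishes, the second instance can be shut down and removed.
```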
