couchdb-user mailing list archives

From Panop S. <...@vizrt.com>
Subject RE: Deleting Document
Date Thu, 03 Feb 2011 08:54:20 GMT
Just found this, and I'm looking into it:

http://wiki.apache.org/couchdb/Regenerating_views_on_update?action=show&redirect=RegeneratingViewsOnUpdate

Hope it helps.
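
If the GET that times out is a view query, it does not have to block on the
indexer: CouchDB can serve the last-built index if you pass stale=ok on the
view request (the rows just won't reflect the deletes until indexing
finishes). A rough sketch, with placeholder database/design-doc/view names,
in Node-style JavaScript:

    // Ask for whatever the view index currently holds, without waiting
    // for the view group indexer to finish processing the bulk delete.
    fetch('http://localhost:5984/mydb/_design/docs/_view/by_parent?stale=ok')
      .then(function (res) { return res.json(); })
      .then(function (body) {
        console.log(body.rows.length + ' rows from the last-built index');
      });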

-----Original Message-----
From: Panop S. [mailto:pas@vizrt.com] 
Sent: Thursday, February 03, 2011 1:27 PM
To: user@couchdb.apache.org
Subject: RE: Deleting Document

Hi,

     When I sent a bulk delete command for about 10000 documents to CouchDB
(1.0.0), the documents were deleted successfully.

     Unfortunately, when I then sent a GET request, the operation timed out.

     Looking at the status, it shows something like "View Group Indexer:
Processed 5700 of 10000 changes (57%)".

     I have to wait patiently until the indexing is complete before I can
get the documents.

     Is something wrong with CouchDB?

Thanks,

-----Original Message-----
From: Zachary Zolton [mailto:zachary.zolton@gmail.com]
Sent: Wednesday, February 02, 2011 10:26 PM
To: user@couchdb.apache.org
Subject: Re: Deleting Document

 

Bulk doc updates (of which deletion is just a special case) have been known
to perform well for thousands of documents at a time.

I'd say querying your view with the limit parameter and performing bulk
updates in batches sounds like a good idea. If you anticipate needing to
work with 50,000 documents at a time, you should spend some time
benchmarking. Experiment with different batch sizes and perhaps share your
results.
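
For illustration, one batch might look roughly like this (a sketch only:
the database name "mydb", design doc "docs", and view "by_ancestor" are
placeholders, and it assumes a JavaScript runtime with a global fetch):

    // Delete up to `limit` docs found by a view in one _bulk_docs request.
    async function deleteBatch(ancestorId, limit) {
      var base = 'http://localhost:5984/mydb';
      var url = base + '/_design/docs/_view/by_ancestor' +
                '?include_docs=true&limit=' + limit +
                '&key=' + encodeURIComponent(JSON.stringify(ancestorId));
      var rows = (await (await fetch(url)).json()).rows;

      // Each deletion is just a stub: _id, _rev and _deleted: true.
      var docs = rows.map(function (row) {
        return { _id: row.id, _rev: row.doc._rev, _deleted: true };
      });

      if (docs.length > 0) {
        await fetch(base + '/_bulk_docs', {
          method: 'POST',
          headers: { 'Content-Type': 'application/json' },
          body: JSON.stringify({ docs: docs })
        });
      }
      return docs.length; // keep calling while this equals `limit`
    }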

 

-Zach

 

On Wed, Feb 2, 2011 at 3:33 AM, Panop S. <pas@vizrt.com> wrote:

> Hi,
>
>            Thanks for your suggestion. For bulk deletion, let's say we
> have 50000 documents under one parent.
>             Is it a good idea to load them all at once with one bulk
> deletion?
>             Or do I need to reduce the number returned by the view, e.g.
> limit=1000, and issue multiple requests?
>
> Big Thanks Again,
>

> -----Original Message-----
> From: Zachary Zolton [mailto:zachary.zolton@gmail.com]
> Sent: Tuesday, February 01, 2011 9:55 PM
> To: user@couchdb.apache.org
> Subject: Re: Deleting Document
>
> You could eliminate the recursion in your algorithm by storing an
> array of ancestor IDs in each of your documents.
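>
> For example (just a sketch -- the field names are up to you), doc 4 from
> the tree in your original message could carry its whole ancestor chain:
>
>   {
>     "_id": "4",
>     "parent_id": "2",
>     "ancestor_ids": ["2", "1"]
>   }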

>
> Then you could just write a by-ancestor view, like so:
>
> function(doc) {
>   // One row per ancestor, so querying by any ancestor's id finds this doc.
>   if (doc.ancestor_ids) {
>     doc.ancestor_ids.forEach(function(id) {
>       emit(id, null);
>     });
>   }
> }

>
> When it's time to delete a document, you just query this view for all
> docs that share the ancestor's ID and delete the matching documents.
> Note that if you want to re-parent a doc you will also need to update
> all its children in this scheme.
>
> For both of these operations you could consider using bulk doc updates:
> http://is.gd/M9aNkg

>
> Depending on your project's requirements, you may benefit by using a
> graph database instead.
>
> Cheers,
>
> Zach
>

> On Tue, Feb 1, 2011 at 2:53 AM, Panop S. <pas@vizrt.com> wrote:

>> Hi,
>>
>>     I have designed my documents in CouchDB like a tree file structure,
>> as follows:
>>
>>      1
>>      |--2
>>      |    |--4
>>      |
>>      |--3
>>
>>      Doc id: 2
>>      Parent: 1
>>
>>      Doc id: 3
>>      Parent: 1
>>
>>      Doc id: 4
>>      Parent: 2
>>
>>      So when I query, I emit by parent id.
>>      Then when I get /id/1, I get docid = 2, 3.
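>>
>>      (Roughly, the map function is something like this -- field name as
>>      in the listing above:)
>>
>>      function(doc) {
>>        if (doc.Parent) {
>>          emit(doc.Parent, null);
>>        }
>>      }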

>>
>>     So if I want to delete 1, it requires 2 steps in the C# application,
>> using recursive calls:
>>
>> 1.       query /id/1, issue a get command for id = 2, and issue a
>> delete command for id = 2
>>
>> 2.       query /id/2, issue a get command for id = 4, and issue a
>> delete command for id = 4
>>
>>    The problem is that when there are a lot of documents, the recursion
>> in the C# code slows performance, the queries load huge numbers of
>> documents, and other client requests cannot be processed.
>>
>>    All I can think of for now is to use &limit=some number on the view
>> to reduce the number of documents loaded at once.
>>
>>       Also, can I write some operation in CouchDB's JavaScript to delete
>> the indexed ids? If so, is that a good idea?
>>
>>          Any ideas?
>>
>> Big Thanks,
