incubator-couchdb-user mailing list archives

From ara howard <ara.t.how...@gmail.com>
Subject dirty reads - update strategies
Date Thu, 13 Nov 2008 17:01:02 GMT

what are people's strategies for dealing with the following scenario?

doc_a = get 'id_a'

doc_b = get 'id_b'

obj_c = { 'sum' : doc_a.x + doc_b.y }

put obj_c
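
to make the window concrete, here is roughly what that looks like
against couch's plain http api - a minimal ruby sketch using net/http
and json from the stdlib, with the db url, doc ids and field names
all placeholders:

require 'net/http'
require 'uri'
require 'json'

DB = 'http://127.0.0.1:5984/mydb'

def get(id)
  JSON.parse(Net::HTTP.get(URI("#{ DB }/#{ id }")))
end

def put(id, doc)
  uri = URI("#{ DB }/#{ id }")
  req = Net::HTTP::Put.new(uri, 'Content-Type' => 'application/json')
  req.body = JSON.generate(doc)
  Net::HTTP.start(uri.host, uri.port){|http| http.request(req)}
end

doc_a = get('id_a')
doc_b = get('id_b')

# nothing stops another client from updating id_a or id_b right here,
# so obj_c can be written from stale reads without any conflict error
obj_c = { 'sum' => doc_a['x'] + doc_b['y'] }

put('id_c', obj_c)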


this kind of thing is tricky even in a traditional RDBMS, since the
default transaction isolation level may or may not allow the
application to see an uncommitted write by another transaction.

the only way i can think of to get consistency from an op like the  
above would be to do

bulk_put [ obj_c, doc_a, doc_b ]
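
for reference, couch's bulk interface is POST /db/_bulk_docs with a
{"docs": [...]} body.  re-using the helpers from the sketch above,
the idea would look something like this - though note that whether a
single rev conflict rejects the whole batch or just that one doc has
differed between couch versions, so the per-document results in the
response need to be checked either way:

def bulk_put(docs)
  uri = URI("#{ DB }/_bulk_docs")
  req = Net::HTTP::Post.new(uri, 'Content-Type' => 'application/json')
  req.body = JSON.generate('docs' => docs)
  res = Net::HTTP.start(uri.host, uri.port){|http| http.request(req)}
  JSON.parse(res.body)
end

# doc_a and doc_b go back unchanged (still carrying their _rev) purely
# so that a concurrent update to either one surfaces as a conflict
obj_c = { '_id' => 'id_c', 'sum' => doc_a['x'] + doc_b['y'] }
results = bulk_put([ obj_c, doc_a, doc_b ])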

in other words, if you are ever going to compute values from couch
docs to produce another doc, it would seem that it's required to put
*all* of the documents you read back in order to ensure that the
sources have not changed since the time that you read them.  the
issue with this, of course, is that a result computed from many
documents scales badly: the chance of a conflicting write grows with
the number of source documents, and the size of each update grows
right along with it.

a solution i can imagine is something like

list = get 'some_view'

obj = computed_value_from list

obj[ '_depends_on' ] = list.map{|element| [element.id, element.rev]}

put obj


so basically a method to do a put with not only your own rev, but
also those of 'n' dependent docs, where only the [id, rev] pairs for
the dependent docs need to be posted.  am i making any sense here?
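
nothing like _depends_on exists in couch today, so take this as a
client-side approximation of the idea only: stamp the derived doc
with the [id, rev] pairs, write it, then re-read the revs and redo
the work if anything moved.  it detects a stale dependency after the
fact rather than preventing one, but it is cheap (again re-using the
helpers from the first sketch, and pulling the deps straight from the
docs rather than a view to keep it short):

def current_rev(id)
  get(id)['_rev']
end

def put_with_dependencies(id, obj, depends_on)
  obj['_depends_on'] = depends_on  # just data - nothing server side enforces it
  put(id, obj)
  stale = depends_on.any?{|doc_id, rev| current_rev(doc_id) != rev}
  raise 'a dependency changed, recompute and retry' if stale
end

deps = [ [doc_a['_id'], doc_a['_rev']], [doc_b['_id'], doc_b['_rev']] ]
put_with_dependencies('id_c', { 'sum' => doc_a['x'] + doc_b['y'] }, deps)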

cheers.



a @ http://codeforpeople.com/
--
we can deny everything, except that we have the possibility of being  
better. simply reflect on that.
h.h. the 14th dalai lama



