couchdb-user mailing list archives

From "Johannes J. Schmidt" <schm...@netzmerk.com>
Subject Re: CouchDB _changes + continuous + Ajax
Date Sat, 25 Dec 2010 09:37:42 GMT
Hi,

You could check the length of req.responseText and call abort() when it
exceeds a limit. Or you could count the number of revs the feed has reported
and enforce a limit of revs per connection.
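
For example, something along these lines (an untested sketch; MAX_BUFFER and
the parsing of the feed lines are placeholders you would fill in yourself):

  var MAX_BUFFER = 1024 * 1024; // reconnect once ~1 MB has accumulated

  function listen(db, since) {
    var lastSeq = since;
    var req = CouchDB.newXhr(); // or: new XMLHttpRequest()
    var url = db + "/_changes?feed=continuous&heartbeat=10000&since=" + since;
    req.open("GET", url, true);
    req.onreadystatechange = function () {
      // ... parse the newly received lines here and update lastSeq ...
      if (req.responseText.length > MAX_BUFFER) {
        req.abort();         // drop the big buffer
        listen(db, lastSeq); // and resume from the last seq we saw
      }
    };
    req.send(null);
  }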

Btw., depending on your application, you should query the database info
object (/dbname) to get the latest seq (update_seq).
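
A minimal sketch of that, using a plain XHR ("/dbname" and the listen()
function from the sketch above are placeholders):

  var infoReq = new XMLHttpRequest();
  infoReq.open("GET", "/dbname", false); // synchronous, just for illustration
  infoReq.send(null);
  var updateSeq = JSON.parse(infoReq.responseText).update_seq;
  listen("/dbname", updateSeq);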

Greetings
Johannes

On Friday, 24.12.2010, at 08:14 -0800, N/A N/A wrote:
> Hi,
> 
> Thank you! Hope you are well too...
> 
> Yes. First I'm doing a changes poll without a feed, so I can retrieve the last_seq 
> key/value. Next I'm doing an Ajax request like yours, with feed, heartbeat and even 
> include_docs.
> This is working just fine. The code below is what I mean. There is a function in 
> couch.js, "CouchDB.request". I patched it a bit for my needs for continuous 
> _changes reads.
> I can post all of the function code later. But what I mean is this:
> 
> 
>   var req = CouchDB.newXhr();
>   // prepend the configured prefix to relative URIs
>   if(uri.substr(0, "http://".length) != "http://") {
>     uri = CouchDB.urlPrefix + uri;
>   }
>   req.open(method, uri, true);
>   req.onreadystatechange = function () {
>       // fires repeatedly while the continuous feed is open;
>       // responseText holds everything received so far
>       console.log(req.responseText);
>   };
> 
> responseText is read-only. While the socket is open, every time 
> onreadystatechange triggers, it only appends to the result. So responseText is 
> growing really big, and after N seconds/hours/days maybe I'll be out of memory. 
> What can I do to avoid this? If something is not clear enough, I'll post some 
> code later with some more explanations and logs.
> 
> Thanks
> 
> 
> 
> ________________________________
> From: Cliff Williams <cliffywills@aol.com>
> To: user@couchdb.apache.org
> Sent: Fri, December 24, 2010 5:27:18 PM
> Subject: Re: CouchDB _changes + continuous + Ajax
> 
> N/A N/A
> 
> I hope you are well and looking forward to Christmas.
> 
> I am not sure that I fully understand, but are you querying using the last 
> sequence?
> 
> I use
> 
> changesurl = "http://" + host + ":" + port.toString() + "/" + database +
>   "/_changes?feed=continuous&include_docs=true&heartbeat=10000" +
>   "&since=" + lastseq;
> 
> 
> I can let you have a copy of a skeleton node.js program that I use if 
> you think that it would help.
> 
> best regards
> 
> cliff
> 
> 
> On 24/12/10 14:04, N/A N/A wrote:
> > Hi there,
> >
> > I am trying to use continuous changes of CouchDB + some JavaScript. Using
> > couch.js, because it is a nice and clean template for experiments, I managed to
> > establish a connection between CouchDB and my webapp. Everything is working fine
> > except for one thing. Because the socket stays open, my responseText in Ajax
> > just keeps appending the JSON answers, so it is getting bigger and bigger and bigger.
> > With a heartbeat of 100, memory usage is growing pretty fast. The JSON data transfer
> > can be really high sometimes and data loss is not an option for me. One
> > solution for me would be to use websockets + node.js (or similar). Or maybe to
> > split() the answer into an array and, when it reaches a certain length, reopen the
> > socket with the current sequence of _changes. But this solution is not to my taste
> > and looks to me like a hack rather than a solution. So, if someone is familiar
> > with continuous changes, could you give me some advice on how to use
> > continuous changes + Ajax without responseText getting so big? Any workaround?
> > I prefer not to use websocket stuff (or something similar), because I prefer to keep
> > things clean and stick only with CouchDB.
> >
> > Thanks in advance
> >
> >
> >
> >
> 
> 
> 

-- 
Netzmerk GbR
Johannes J. Schmidt

http://netzmerk.com
01525 378 65 21

Am Vierstückenpfuhl 3a
14167 Berlin
