couchdb-user mailing list archives

From Robert Newson <robert.new...@gmail.com>
Subject Re: curl -d out of memory
Date Thu, 05 May 2011 11:43:15 GMT
Hi,

Well, curl will load that document fully into memory, and so will
couchdb. So that requires 6.2 GB of memory, even assuming both hold it
efficiently (which is not true for couchdb). You can try streaming the
file with -T <filename>, which means curl won't consume 3.1 GB of
memory, but a better approach would be to split the upload into
smaller batches against _bulk_docs. Parsing a massive JSON blob in a
single request is likely to take excessive time anyway, once you get
that far.
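A rough sketch of the batching approach in Python (the URL is taken from the
original post; BATCH_SIZE and the helper names are hypothetical, and for a
3.1 GB file you would feed `upload` from a streaming JSON parser such as
ijson rather than `json.load`, which would reproduce the memory problem):

```python
import itertools
import json
from urllib import request

COUCH_URL = "http://127.0.0.1:5984/mydb/_bulk_docs"  # from the original post
BATCH_SIZE = 1000  # hypothetical; tune so each request stays small

def batches(docs, size):
    """Yield successive lists of at most `size` docs from any iterable."""
    it = iter(docs)
    while True:
        chunk = list(itertools.islice(it, size))
        if not chunk:
            return
        yield chunk

def upload(docs):
    """POST each batch to _bulk_docs instead of one huge request."""
    for chunk in batches(docs, BATCH_SIZE):
        body = json.dumps({"docs": chunk}).encode("utf-8")
        req = request.Request(
            COUCH_URL,
            data=body,
            headers={"Content-Type": "application/json"},
        )
        with request.urlopen(req) as resp:
            resp.read()  # CouchDB returns a per-document result list
```

Each request then holds only one batch in memory on both sides, instead of
the whole 3.1 GB document set.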

B.

On 5 May 2011 12:34, Mauro Fagnoni <mauro.fagnoni@gmail.com> wrote:
> Hi all, I need help resolving an error I hit while uploading my
> file.json into a database.
>
> I used this command:
>
> $ curl -d @your_file.json -X POST http://127.0.0.1:5984/mydb/_bulk_docs
>
> but a few moments later
>
> it returns this error:
>
> curl: option -d: out of memory.
>
> Any idea how to solve this problem? Is it possible that my json file
> is too big? (the file is 3.1 GB)
>
> Regards
>
> --
> -----------------------------------------------
> [-------WHOAMI------] Mauro Fagnoni
> [----------ICQ#---------] 279572903
> [--------MSNID--------] maurofagnoni@yahoo.it
> [--YAHOOMSNID--] maurofagnoni@gmail.com
> [--GOOGLETALK--] mauro.fagnoni@gmail.com
> [-GOOGLEWAVE-] mauro.fagnoni@googlewave.com
> [------JABBER-------] mauro.fagnoni@gmail.com
> [------SKYPE--------] mauro.fagnoni
> [-----LinuxUser#----] 346345
> [----------Blog---------] http://kingmauro.wordpress.com
> -----------------------------------------------
>
