From: Monica Razdan
To: user@couchdb.apache.org
Date: Tue, 30 Aug 2011 10:35:19 +0200
Subject: cUrl _bulk_docs upload runs out of memory

Hello,

I am exporting data from MySQL to CouchDB using _bulk_docs.
Some of the JSON files are gigabytes in size, and curl runs out of memory reading them when I try to bulk-upload the docs. Is there a way to stream the files while sending a POST request? For now I'm splitting the bigger files, but some of the pieces still end up as large as 800 MB, because some of the files contain around 240,000 rows. What is the maximum size limit when posting files with curl? Is there an alternative to curl for uploading chunks of JSON files to CouchDB?

Thanks,
Monica
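[A follow-up sketch, not from the original thread:] One way to avoid both the huge intermediate files and curl's memory use is to build the _bulk_docs payloads in bounded-size batches on the exporting side. Below is a minimal, hypothetical Python sketch: `bulk_batches` takes any iterator of docs (e.g. rows streamed out of MySQL) and yields `{"docs": [...]}` payload strings, each kept under a byte limit, so the full export never has to sit in memory at once. The function name and the 10 MB default are illustrative assumptions, not anything from CouchDB itself.

```python
import json

def bulk_batches(docs, max_bytes=10 * 1024 * 1024):
    """Yield _bulk_docs payload strings, each at most ~max_bytes long.

    Serialises one doc at a time, so `docs` can be a generator that
    streams rows out of the source database; nothing requires the
    whole export to fit in memory.
    """
    empty_overhead = len('{"docs":[]}')
    batch, size = [], empty_overhead
    for doc in docs:
        encoded = json.dumps(doc)
        # Start a new batch if adding this doc would exceed the limit
        # (a single oversized doc still goes out as its own batch).
        if batch and size + len(encoded) + 1 > max_bytes:
            yield '{"docs":[' + ",".join(batch) + "]}"
            batch, size = [], empty_overhead
        batch.append(encoded)
        size += len(encoded) + 1
    if batch:
        yield '{"docs":[' + ",".join(batch) + "]}"

# Each yielded payload can then be POSTed to
# http://localhost:5984/<db>/_bulk_docs with
# Content-Type: application/json.
```

On the curl side, `-d @file` and `--data-binary @file` read the whole file into memory before sending, which is likely the failure seen here. `curl -X POST -T batch.json -H 'Content-Type: application/json' http://localhost:5984/<db>/_bulk_docs` uploads from the file via curl's read callback instead of buffering it wholesale, so combining small batches with `-T` should keep memory flat.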