subversion-users mailing list archives

From Daniel Shahaf <>
Subject Re: "svnadmin load" a huge file
Date Fri, 07 Jan 2011 17:58:23 GMT
Les Mikesell wrote on Fri, Jan 07, 2011 at 10:37:12 -0600:
> On 1/7/2011 7:57 AM, Victor Sudakov wrote:
>>>>> I migrated a large CVS repository (25-50 GB) to SVN years ago on SVN
>>>>> 1.3.  Our repo had many sections (projects) within it.  We had to
>>>>> migrate each project independently so that its team could coordinate
>>>>> when they migrated to SVN.  As such, I dumped each project when ready
>>>>> and then svnadmin loaded each dump into its own path/root (so as not
>>>>> to overwrite anything previously loaded and unrelated to this project's
>>>>> import).
>> It would be fine if the project in question did not contain almost all
>> the files in one directory. You may call the layout silly, but CVS does
>> not seem to mind. OTOH, I would have distributed the files over
>> several subdirectories, but CVS does not handle moving files well.
>> I wonder if cvs2svn is to blame for producing a dump that svnadmin
>> cannot load. Or am I always risking that "svnadmin dump" may one day
>> produce a dump that a subsequent "svnadmin load" will be unable to swallow?
>> I mean, if by hook or by crook, by using third party utilities like
>> svndumptool, I will eventually be able to convert this project from
>> CVS to SVN. Is there a chance that a subsequent dump will be again
>> unloadable?
> I don't think you are hitting an absolute limit in the software here,
> just running out of RAM on your particular machine.  Can you do the
> conversion on a machine with more RAM?

I believe there are known issues with memory usage in svnadmin.  See the
issue tracker.

I don't know cvs2svn, but it could have a --sharded-output option, so e.g.
it would produce one dumpfile per 1000 revisions, rather than one huge file.
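Even without such an option, something similar can be sketched in shell
against an already-loaded repository: split the dump into fixed-size
revision ranges with `svnadmin dump -r`. Note this is only a sketch, not
an established workflow; the `shard_ranges` helper name is mine, and the
`--sharded-output` flag mentioned above is hypothetical.

```shell
# Print "start end" revision-range pairs covering 0..HEAD,
# 1000 revisions per shard (the last shard may be shorter).
shard_ranges() {
  head=$1
  start=0
  while [ "$start" -le "$head" ]; do
    end=$((start + 999))
    [ "$end" -gt "$head" ] && end="$head"
    echo "$start $end"
    start=$((end + 1))
  done
}

# Sketch of usage against a real repository (requires Subversion tools;
# /path/to/repo is a placeholder):
#   HEAD=$(svnlook youngest /path/to/repo)
#   shard_ranges "$HEAD" | while read -r s e; do
#     svnadmin dump /path/to/repo -r "$s:$e" --incremental \
#       > "dump-$s-$e.svndump"
#   done
```

Loading the shards back in order with repeated `svnadmin load` calls
should then reconstruct the history without ever holding one huge
dumpfile.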

> -- 
>   Les Mikesell
