subversion-users mailing list archives

From Ryan Schmidt <subversion-2...@ryandesign.com>
Subject Re: Simultaneous svnserve requests lead to "Can't read from connection: Connection reset by peer"
Date Sat, 03 May 2014 08:37:16 GMT

On May 2, 2014, at 11:29, Philip Martin wrote:

> Ryan Schmidt writes:
> 
>> Is there a limit on the number of simultaneous connections I can make
>> from a single svn client to svnserve?
> 
> No, there is no rate limiting at all in 1.8 svnserve: it will attempt to
> start threads/processes as fast as incoming connections arrive.  This
> can cause the server to run out of resources, which may in turn cause
> processes to get killed.  However, your test case of 10 simultaneous
> exports isn't very big, so I'd not expect a problem.
> 
>> The client and server are on a local network connected by 100mbit
>> Ethernet. Both are 64-bit Intel Macs running OS X 10.9.2 with
>> Subversion 1.8.8 installed using MacPorts. The server is started with
>> a shell script containing:
>> 
>> #!/bin/sh
>> svnserve --daemon --foreground --read-only --log-file /dev/stdout \
>>   --client-speed 100 --root "$(dirname "$0")"
> 
> You say you want it to be faster.  If svnserve is the bottleneck then
> using thread mode rather than fork mode and increasing the size of the
> FSFS cache will make svnserve much faster, your network may well become
> the bottleneck.
> 
> http://subversion.apache.org/docs/release-notes/1.7.html#server-performance-tuning
> http://subversion.apache.org/docs/release-notes/1.8.html#fsfs-enhancements


Hi Philip, thanks for the suggestions.

It’s hard to tell whether using threads, enabling all the caching, and increasing the memory
cache to 512MB have really helped.


Does the cache expire after a time? Or is there anything else in the server that would get
reset after a period of inactivity? It seems odd, but what I'm observing is that when I
initially run the exports, some of them fail with the "Connection reset by peer" error.
Then, if I repeat the same exports a few times, they work fine. Then, if I wait several
minutes and try again, some of them fail again.


I also once got:

svn: E000024: Can't open file '/path/to/svn/db/current': Too many open files
svn: E000024: Can't open file '/path/to/svn/db/current': Too many open files
svn: E000024: Unable to connect to a repository at URL 'svn://example.local/trunk/foo'
svn: E000024: Can't open file '/path/to/svn/format': Too many open files
svn: E000024: Can't open file '/path/to/svn/db/revs/87.pack/pack': Too many open files
svn: E000024: Can't open file '/path/to/svn/db/revs/84.pack/manifest': Too many open files
svn: E000024: Can't open file '/path/to/svn/db/revs/112.pack/pack': Too many open files
svn: E000024: Can't open file '/path/to/svn/db/revs/94.pack/pack': Too many open files
svn: E000024: Can't open file '/path/to/svn/db/revs/114.pack/pack': Too many open files
svn: E000024: Can't open file '/path/to/svn/db/revs/112.pack/pack': Too many open files

I should mention this is a packed repository. I created it within the past couple weeks by
svnsyncing a public repository, then packing it.

I realize OS X’s default maximum number of open files in the shell is 256; adding “ulimit
-n 4096” to the svnserve start script should make that problem less likely to occur.
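For what it's worth, here's a minimal sketch of what I mean (the 4096 value is just the one I picked; the hard limit caps how high the soft limit can go):

```shell
#!/bin/sh
# Print the soft open-file limit inherited from the shell
# (256 by default in an OS X login shell).
echo "limit before: $(ulimit -n)"

# Raise the soft limit for this script and every process it starts,
# e.g. an svnserve launched later in the same script.
ulimit -n 4096
echo "limit after: $(ulimit -n)"
```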


My start script is now:

#!/bin/sh
ulimit -n 4096
svnserve --daemon --threads --foreground --read-only --log-file /dev/stdout \
  --memory-cache-size 512 --cache-fulltexts yes --cache-revprops yes \
  --cache-txdeltas yes --client-speed 100 --root "$(dirname "$0")"


I don’t know whether svn is the bottleneck for my script. The script needs to do several
tasks (run several exports, modify some of the files, then analyze them; repeat), and since
I have a multicore machine, I wrote it to do things in parallel so the CPUs would never sit
idle waiting for downloads from the repository.

The reason I thought to run multiple svn exports simultaneously is that web browsers open
multiple connections to web servers to increase performance.

Perhaps I should rewrite the script to only do the local tasks in parallel and run only one
svn export at a time since I haven’t seen any errors when doing that.
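If I go that route, the structure would be something like this sketch (export_one and process_tree are hypothetical stand-ins for the real `svn export` call and the modify/analyze steps):

```shell
#!/bin/sh
# Sketch: run the network step strictly one at a time, but let the
# local CPU-bound work run in parallel in the background.
export_one() {
    # stand-in for: svn export "svn://example.local/trunk/$1" "$1"
    echo "exported $1"
}
process_tree() {
    # stand-in for the modify/analyze step on an exported tree
    echo "processed $1"
}

for name in foo bar baz; do
    export_one "$name"       # network: only one svn connection at a time
    process_tree "$name" &   # CPU: runs in the background
done
wait                         # block until all background jobs finish
```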

I tried a couple of public svnserve-hosted repositories and could not reproduce the issue.
I don’t think there’s anything terribly special about the repo with which I’m seeing the
problem, but it has been around for a while and does have > 100,000 revisions. The other
repos I tested didn’t have that many revisions, and I don’t know whether they were packed,
what version of svnserve they were running, or with what options.


