incubator-couchdb-user mailing list archives

From Alexander Shorin <kxe...@gmail.com>
Subject Re: A million databases
Date Tue, 25 Mar 2014 18:38:03 GMT
On Tue, Mar 25, 2014 at 10:27 PM, Jens Alfke <jens@couchbase.com> wrote:
> On Mar 25, 2014, at 12:41 AM, Suraj Kumar <suraj.kumar@inmobi.com> wrote:
>
>> If there are a million "*.couch" files under var/lib/couchdb/, I'd expect the
>> performance to be very poor / unpredictable since it now depends on the
>> underlying file system's logic.
>
> Do modern filesystems still have performance problems with large directories? I’m sure
> none of them are representing directories as linear arrays of inodes anymore. I’ve been
> wondering if this is just folk knowledge that’s no longer relevant.

Most of the issues come from tools that can't operate effectively on
that many files. That ruins the usability of keeping millions of files
in a single directory, even when the filesystem itself copes.
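A common workaround (not something CouchDB does itself -- it stores all
.couch files flat under var/lib/couchdb/) is to shard files into nested
subdirectories keyed by a hash prefix of the name, so no single directory
ever holds millions of entries. A minimal sketch, with hypothetical names:

```python
import hashlib
import os

def shard_path(base_dir, db_name, levels=2, width=2):
    """Map a database name to a nested subdirectory derived from an
    MD5 prefix of the name. With levels=2 and width=2 this spreads
    files over 65536 directories, keeping each one small enough for
    ls, Explorer, backup tools, etc. to handle comfortably."""
    digest = hashlib.md5(db_name.encode("utf-8")).hexdigest()
    # Take `levels` chunks of `width` hex characters each, e.g. "a3/f1".
    parts = [digest[i * width:(i + 1) * width] for i in range(levels)]
    return os.path.join(base_dir, *parts, db_name + ".couch")

print(shard_path("var/lib/couchdb", "userdb-0001"))
```

The mapping is deterministic, so lookups never need to scan a directory:
the path is recomputed from the name alone.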

Also: http://events.linuxfoundation.org/slides/2010/linuxcon2010_wheeler.pdf

As for Windows: never try to open a directory with thousands of files
in it using the default file manager, Explorer.


--
,,,^..^,,,
