jackrabbit-users mailing list archives

From: Ulrich <For...@gombers.de>
Subject: Re: no space left on device-exception by Property.getStream()
Date: Fri, 12 Apr 2013 13:29:58 GMT
Retrieving the data is completely sequential, with no concurrent processing at all. I changed the code
to call session.logout() and session.connect() after every step, but this didn't help.
So the code works like this:
for (String path : pathList) {
     Session session = ...            // a fresh session for every path
     Node currentNode = session.getNode(path);
     Node filenode = currentNode.getNode("jcr:content");
     Property jcrdata = filenode.getProperty("jcr:data");
     InputStream is = jcrdata.getBinary().getStream();
     // ... the data is read from the stream here ...
     is.close();
     session.logout();
}

To be honest, this is not the exact code; the logic is spread over two classes - but it shows
the effective data flow.
Nevertheless - the problem remains.
But when I retry the whole sequence later on, I get the same result, i.e. the run fails at the
same point again - which means the buffer must have been cleared in the meantime.

It looks as if there is a kind of garbage collection running asynchronously - not fast enough
to avoid the error, but finished after a while. I tried to track the storage space with
'df -vk' but couldn't see a problem there. On Monday (I'm not in the office right now) I will insert
a Thread.sleep(20000) into the workflow above to verify my theory.
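
For reference, here is a minimal, self-contained sketch of the loop with eager cleanup - try-with-resources
for the stream plus an explicit Binary.dispose() - so the temporary buffers can be released right away.
This is not the real code: repository, credentials, pathList and consume() are placeholders.

void readBinaries(Repository repository, Credentials credentials, List<String> pathList) throws Exception {
    for (String path : pathList) {
        Session session = repository.login(credentials);
        try {
            Node filenode = session.getNode(path).getNode("jcr:content");
            Binary binary = filenode.getProperty("jcr:data").getBinary();
            try (InputStream is = binary.getStream()) {
                consume(is);       // read the data here
            } finally {
                binary.dispose();  // lets the repository release the temp file / in-memory buffer
            }
        } finally {
            session.logout();
        }
    }
}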

Best regards,
Ulrich





On 12 Apr 2013, at 10:13, Stefan Guggisberg <stefan.guggisberg@gmail.com> wrote:

> On Fri, Apr 12, 2013 at 12:21 AM, Ulrich <Forums@gombers.de> wrote:
>> While retrieving lots of data in a loop from several nt:file nodes I always get a
>> "no space left on device" exception. The code is:
>> Node filenode = Node.getNode("jcr:content");
>> Property jcrdata = filenode.getProperty("jcr:data");
>> InputStream is = jcrdata.getBinary().getStream();
>> It seems that the InputStream is buffered somewhere for the current session and that
>> the total buffer size for a session is limited. Is this true, and if so, how can I control
>> this size? Or is there a way to free the space? I could probably close my session and
>> open a new one, but I would need to change the logic of my program.
>> 
>> Any hint is very welcome.
> 
> larger binaries are buffered in temp files on read (smaller ones are
> buffered in-mem).
> therefore, reading a lot of binaries concurrently will result in a lot
> of temp files.
> those temp files will go away once they're not referenced anymore.
> you're obviously running out of disk space.
> 
> the following should help:
> 
> 1. make sure you close the input stream as early as possible
> 2. if this is a specific job you're running (e.g. an export) you could
>    try forcing gc cycles in between
> 3. increase your disk space
> 
> cheers
> stefan
> 
>> 
>> Ulrich
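
A rough sketch of suggestion 2 above - forcing a gc cycle every so often so that unreferenced
binaries (and their temp files) can be reclaimed. The batch size of 100 and processNode() are
placeholders, not part of the real code:

int count = 0;
for (String path : pathList) {
    processNode(path);            // reads one nt:file node as in the loop above
    if (++count % 100 == 0) {
        System.gc();              // only a hint - the JVM may ignore it
        System.runFinalization();
    }
}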
