jackrabbit-users mailing list archives

From "Shah, Sumit (CGI Federal)" <Sumit.S...@cgifederal.com>
Subject RE: Jackrabbit 2.2.5 - loss of data [SEC=UNCLASSIFIED]
Date Tue, 20 Dec 2011 15:43:39 GMT
Thanks Ross. It seems the content is present on the filesystem: I can see the old documents
in the repository/datastore folders. But the link between the Jackrabbit metadata (e.g. path)
and the content appears to be broken. Any reason why this would happen?

Does Jackrabbit use UUIDs internally to store the metadata and the content itself?
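[Editor's note: a short sketch may help here. To my understanding, the default FileDataStore in Jackrabbit 2.x is content-addressed: each binary is stored under a path derived from the SHA-1 of its contents, and the node metadata (kept by the persistence manager) references that identifier rather than a filesystem path. The exact directory depth below is my reading of the 2.x FileDataStore layout, not something confirmed in this thread:]

```python
import hashlib

def datastore_path(content: bytes) -> str:
    """Relative path a FileDataStore-style content-addressed store would
    use: the record identifier is the SHA-1 hex digest of the content,
    filed under two-character subdirectories taken from that digest."""
    digest = hashlib.sha1(content).hexdigest()
    return "/".join([digest[0:2], digest[2:4], digest[4:6], digest])

# Empty content hashes to the well-known empty-string SHA-1:
print(datastore_path(b""))
# da/39/a3/da39a3ee5e6b4b0d3255bfef95601890afd80709
```

[The practical consequence: if the metadata side loses its reference, the blob can still sit untouched under repository/datastore, which matches the symptom described above — content visible on disk but unreachable by path.]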


From: Ross.Dyson@ipaustralia.gov.au [mailto:Ross.Dyson@ipaustralia.gov.au]
Sent: Monday, December 19, 2011 9:20 PM
To: users@jackrabbit.apache.org
Cc: users@jackrabbit.apache.org
Subject: Re: Jackrabbit 2.2.5 - loss of data [SEC=UNCLASSIFIED]

This looks suspiciously like a problem I have had before, where somebody writes a script to
delete files that look like temp files: no file extension, more than a month old. I had one that
was deleting classes created at runtime, so each morning there was a good chance of getting
classloader errors.

Best of luck.

From:        "Shah, Sumit (CGI Federal)" <Sumit.Shah@cgifederal.com>
To:        "users@jackrabbit.apache.org" <users@jackrabbit.apache.org>
Date:        20/12/2011 11:58 AM
Subject:        Jackrabbit 2.2.5 - loss of data

Hi All,

I am running into a serious issue. It seems like I am unable to retrieve documents from Jackrabbit
that are more than a month old. I get the following error:

"JCR Action 'Get stream' cannot be performed because the provided path does not exist"

I am running Jackrabbit in standalone mode and also in a clustered environment, and I am seeing
the same issue in both. When does this happen? Is there a self-initiated process that cleans
up data within Jackrabbit? What are the possible resolutions?
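[Editor's note: for anyone finding this later — to my knowledge Jackrabbit 2.x does not run data store garbage collection on its own; it must be triggered explicitly, so an external cleanup job on the datastore directory is the more likely culprit. For context, the data store is enabled in repository.xml; a minimal FileDataStore entry (the path value here is illustrative) looks like:]

```xml
<!-- repository.xml: store binaries of at least minRecordLength bytes on
     disk, addressed by content hash, instead of inside the persistence
     manager's database tables -->
<DataStore class="org.apache.jackrabbit.core.data.FileDataStore">
  <param name="path" value="${rep.home}/repository/datastore"/>
  <param name="minRecordLength" value="100"/>
</DataStore>
```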

I would appreciate any help on this.

