jackrabbit-oak-issues mailing list archives

From "Julian Reschke (Jira)" <j...@apache.org>
Subject [jira] [Comment Edited] (OAK-8912) Version garbage collector is not working if documents exceeded 100000
Date Wed, 06 May 2020 12:10:00 GMT

    [ https://issues.apache.org/jira/browse/OAK-8912?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17100663#comment-17100663 ]

Julian Reschke edited comment on OAK-8912 at 5/6/20, 12:09 PM:
---------------------------------------------------------------

1) 1.10.2 is outdated; please use the latest release from the current maintenance branch, 1.22
(1.22.3 right now).

2) If you can reproduce this, please attach enough code so that we can reproduce it as well.
Ideally, look at the existing VersionGC tests and work out why they pass (what's the difference
compared to your setup?). A bare-bones reproduction skeleton is sketched below.
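
For illustration only, a hypothetical skeleton of such a reproduction, loosely modelled on the
style of the existing VersionGC tests: the MemoryDocumentStore stand-in, the class name and the
node counts are assumptions, and the RDB setup from the issue description would take their place.

    // Hypothetical repro skeleton (not the project's test code): create and then
    // delete more documents than the default collect limit, then run the version GC.
    import java.util.concurrent.TimeUnit;
    import org.apache.jackrabbit.oak.plugins.document.DocumentNodeStore;
    import org.apache.jackrabbit.oak.plugins.document.DocumentNodeStoreBuilder;
    import org.apache.jackrabbit.oak.plugins.document.memory.MemoryDocumentStore;
    import org.apache.jackrabbit.oak.spi.commit.CommitInfo;
    import org.apache.jackrabbit.oak.spi.commit.EmptyHook;
    import org.apache.jackrabbit.oak.spi.state.NodeBuilder;

    public class VersionGCRepro {
        public static void main(String[] args) throws Exception {
            // In-memory stand-in; the report would use setRDBConnection(ds, options) instead.
            DocumentNodeStoreBuilder<?> builder = DocumentNodeStoreBuilder.newDocumentNodeStoreBuilder();
            builder.setDocumentStore(new MemoryDocumentStore());
            builder.setAsyncDelay(0);
            DocumentNodeStore ns = builder.build();
            try {
                // Create more nodes than the default collect limit of 100000 ...
                NodeBuilder b = ns.getRoot().builder();
                for (int i = 0; i < 150000; i++) {
                    b.child("node-" + i);
                }
                ns.merge(b, EmptyHook.INSTANCE, CommitInfo.EMPTY);

                // ... and delete them again so their documents become GC candidates.
                b = ns.getRoot().builder();
                for (int i = 0; i < 150000; i++) {
                    b.getChildNode("node-" + i).remove();
                }
                ns.merge(b, EmptyHook.INSTANCE, CommitInfo.EMPTY);

                // Run the collector as in the report; the real tests additionally advance
                // a virtual clock so the deleted documents are old enough to be collected.
                ns.getVersionGarbageCollector().gc(0, TimeUnit.DAYS);
            } finally {
                ns.dispose();
            }
        }
    }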

3) The exception indicates that a connection wasn't properly closed; setting the system property
"org.apache.jackrabbit.oak.plugins.document.rdb.RDBConnectionHandler.CHECKCONNECTIONONCLOSE"
to "true" might give you better diagnostics.



> Version garbage collector is not working if documents exceeded 100000
> ---------------------------------------------------------------------
>
>                 Key: OAK-8912
>                 URL: https://issues.apache.org/jira/browse/OAK-8912
>             Project: Jackrabbit Oak
>          Issue Type: Bug
>          Components: documentmk
>            Reporter: Ankush Nagapure
>            Priority: Major
>         Attachments: exception.txt
>
>
> Oak version - 1.10.2, PostgreSQL 10.7 (10.7), using driver: PostgreSQL JDBC Driver 42.2.2 (42.2).
> *Actual:*
> After running the code below, once the number of documents exceeds the collect limit of 100000,
> the exception attached in exception.txt is thrown.
> public static void runVersionGC() {
>     log.info("Running garbage collection for DocumentNodeStore");
>     try {
>         final VersionGCOptions versionGCOptions = new VersionGCOptions();
>         versionGCOptions.withCollectLimit(1000000);
>         documentNodeStore.getVersionGarbageCollector().setOptions(versionGCOptions);
>         log.info("versionGCOptions.collectLimit : " + versionGCOptions.collectLimit);
>         documentNodeStore.getVersionGarbageCollector().gc(0, TimeUnit.DAYS);
>     } catch (final DocumentStoreException e) {
>         //
>     }
> }
> Below is the code that creates the repository and obtains the documentNodeStore object used for
> version garbage collection.
> private static Repository createRepo(final Map<String, String> dbDetails)
>         throws DataStoreException {
>     try {
>         final RDBOptions options =
>             new RDBOptions().tablePrefix(dbDetails.get(DB_TABLE_PREFIX)).dropTablesOnClose(false);
>         final DataSource ds =
>             RDBDataSourceFactory.forJdbcUrl(
>                 dbDetails.get("dbURL"),
>                 dbDetails.get("dbUser"),
>                 dbDetails.get("dbPassword"));
>         final Properties properties = buildS3Properties(dbDetails);
>         final S3DataStore s3DataStore = buildS3DataStore(properties);
>         final DataStoreBlobStore dataStoreBlobStore = new DataStoreBlobStore(s3DataStore);
>         final Whiteboard wb = new DefaultWhiteboard();
>         bapRegistration =
>             wb.register(BlobAccessProvider.class, (BlobAccessProvider) dataStoreBlobStore, properties);
>         documentNodeStore =
>             new RDBDocumentNodeStoreBuilder()
>                 .setBlobStore(dataStoreBlobStore)
>                 .setBundlingDisabled(true)
>                 .setRDBConnection(ds, options)
>                 .build();
>         repository = new Jcr(documentNodeStore).with(wb).createRepository();
>         return repository;
>     } catch (final DataStoreException e) {
>         log.error("S3 Connection could not be created. " + e);
>         throw new DataStoreException("S3 Connection could not be created");
>     }
> }
> Even after setting collectLimit in the code, the garbage collector still uses 100000 as the
> limit.
> *Expected:*
> It should be possible to set versionGCOptions.collectLimit to a custom value to avoid the
> DocumentStoreException, or there should be some other way to avoid the DocumentStoreException
> when the number of documents exceeds 100000.
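
A note on the collectLimit observation above: assuming VersionGCOptions follows a copy-on-write
style in which withCollectLimit() returns a modified copy rather than changing the receiver (an
assumption to verify against the Oak version in use), the returned instance has to be captured
for the custom limit to take effect. A minimal sketch, reusing the documentNodeStore and log
fields from the code above:

    // Sketch only: assumes withCollectLimit() returns a modified copy (see note above).
    VersionGCOptions options = new VersionGCOptions();
    options = options.withCollectLimit(1000000);             // keep the returned instance
    documentNodeStore.getVersionGarbageCollector().setOptions(options);
    log.info("collectLimit now: " + options.collectLimit);   // expected: 1000000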



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
