commons-user mailing list archives

From Greg Parker <>
Subject JCS LateralTCPCacheFactory
Date Thu, 27 Jul 2017 02:10:57 GMT
I'm trying to get a sample cluster of Tomcat servers (servers A, B, and C) to use a distributed
cache.  The "hello" cache is configured on each server with a different TcpListenerPort
(8020, 8021, 8022) like this:



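(The attached configuration did not survive in the archive.  For reference, a cache.ccf along
these lines would match the description above -- a "hello" region backed by a lateral TCP
auxiliary.  Class and package names here assume commons-jcs 2.x; adjust TcpListenerPort on
each server, and TcpServers should point at the peer servers:)

```properties
# Region "hello" backed by the lateral TCP auxiliary "LTCP"
jcs.region.hello=LTCP
jcs.region.hello.cacheattributes=org.apache.commons.jcs.engine.CompositeCacheAttributes
jcs.region.hello.cacheattributes.MaxObjects=1000

# Lateral TCP auxiliary -- TcpListenerPort differs per server (8020 / 8021 / 8022)
jcs.auxiliary.LTCP=org.apache.commons.jcs.auxiliary.lateral.socket.tcp.LateralTCPCacheFactory
jcs.auxiliary.LTCP.attributes=org.apache.commons.jcs.auxiliary.lateral.socket.tcp.TCPLateralCacheAttributes
jcs.auxiliary.LTCP.attributes.TcpServers=localhost:8021,localhost:8022
jcs.auxiliary.LTCP.attributes.TcpListenerPort=8020
```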
Server A has a page that sets a value in the cache like this:

        cache = JCS.getInstance("hello");
        cache.put("message", "Hello from server A");

Servers B and C try to get the value without setting it:

        cache = JCS.getInstance("hello");
        add(new Label("message", "Server B -> cache has : " + cache.get("message")));

The first time I access the page on server A I get an exception: "Cannot connect to localhost:8021".
If I access the page on each server, things appear to start communicating; however, the
"message" entry in the "hello" region is set on server A and null on B and C.  Sometimes
I see the value displayed properly on server C, but never on server B.

With a configuration like this, should I expect values to be distributed immediately to the
other servers?

Is there a way I should be bootstrapping the cache when the server starts?  If the servers
need to communicate with one another, it would seem they need to be listening long before
the first time a server tries to access the cache.

For performance I would like to use the cache to store serialized XML data instead of storing
it in the session or a database.  The data remains in the cache while the user transforms
it through several request cycles.  So during a request cycle the XML data would be retrieved,
transformed, and stored back in the cache.  In a clustered environment I need the XML data
available on every server in the cluster.  Is this an acceptable use case for JCS?

