tomcat-dev mailing list archives

From d...@lexmark.com
Subject Re: JKStatus Bug?
Date Tue, 16 May 2006 17:42:26 GMT
Hi Rainer,

Thanks for the reply.

As far as configuration change suggestions go, how about making things more
fine-grained, so you can specify the worker within the balancer, e.g.:

    worker.adminloadbalancer.BLUFF.disabled=1

Presumably something like that is happening within jkstatus?
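
To spell the idea out (this balancer-qualified syntax is purely hypothetical —
it does not exist in mod_jk's workers.properties as of this writing):

    # Hypothetical per-balancer override: disable BLUFF only within
    # adminloadbalancer...
    worker.adminloadbalancer.BLUFF.disabled=1
    # ...while leaving BLUFF enabled in clientloadbalancer.
    worker.clientloadbalancer.BLUFF.disabled=0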

cheers,

David
x54680


From: Rainer Jung <rjung@apache.org>
Date: 05/16/2006 01:36 PM
Reply-To: Tomcat Developers List
To: Tomcat Developers List <dev@tomcat.apache.org>
Subject: Re: JKStatus Bug?



Hi,

it's true that jkstatus doesn't persist changes. There is no
functionality there to write a workers.properties file (it's somewhere
near the end of the TODO list).

Concerning disabled: yes, at the moment disabled is an attribute
belonging to a worker, and when using stickiness you can only have one
worker per jvmRoute.

So if you want to use a worker in several balancers with different
enable/disable or start/stop values, the workers.properties gives you no
way to configure that.
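
To illustrate with the syntax that exists today: disabled can only be set
per worker, so it takes effect in every balancer that references that
worker:

    # Current syntax: this disables BLUFF everywhere it is used --
    # in adminloadbalancer, clientloadbalancer and adaptorloadbalancer alike.
    worker.BLUFF.disabled=1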

Any ideas what such a configuration could look like? If the idea looks
good, I might implement it :)

If you only have further user questions, please take them to
users@tomcat.apache.org.

Concerning improvement of the configuration syntax in workers.properties,
this thread is the right place.

Regards,

Rainer

P.S.: local_worker and local_worker_only no longer exist as of some time
before 1.2.15. The attributes are simply ignored.
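
For example, in a workers.properties like the one quoted below, lines such
as

    worker.adminloadbalancer.local_worker_only=1

can simply be deleted; they have no effect.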

dhay@lexmark.com wrote:
> Hi,
>
> We're using 3 load balancers to separate our requests (client, admin,
> etc.) and numerous Tomcat servers (we're running Apache in front).
>
> We need to be able to disable servers on a load-balancer level - so I
> need to disable 2 of the 3 load balancers on a particular server, say.
> We can do this fine using the jkstatus page, but when the machines are
> restarted, the changes don't seem to have been persisted.
>
> And it seems that workers.properties is not fine-grained enough to handle
> this?
>
> Our setup is below...
>
> Any ideas how to get around this?
>
> cheers,
>
> David
>
>
> mod-jk.conf snippet:
>
> JKMount /framework/admin/* adminloadbalancer
> JKMount /framework/httpadaptor/* adaptorloadbalancer
> JkMount /framework/* clientloadbalancer
>
> # if you wanted to only load-balance a sub-context, you could
> # map the module differently, such as:
> # JkMount /myContext/* loadbalancer
>
> JkMount /status/* jkstatus
>
>
>
>
> workers.properties:
>
> worker.LAUREL.type=ajp13
> worker.LAUREL.lbfactor=1
> worker.LAUREL.cachesize=25
> worker.LAUREL.port=8009
> worker.LAUREL.host=LAUREL.mw.prtdev.lexmark.com
>
> worker.BLUFF.type=ajp13
> worker.BLUFF.lbfactor=1
> worker.BLUFF.cachesize=25
> worker.BLUFF.port=8009
> worker.BLUFF.host=BLUFF.mw.prtdev.lexmark.com
>
>
> worker.adminloadbalancer.type=lb
> worker.adminloadbalancer.method=B
> worker.adminloadbalancer.sticky_session=1
> worker.adminloadbalancer.sticky_session_force=1
> worker.adminloadbalancer.local_worker_only=1
> worker.adminloadbalancer.balanced_workers=BLUFF,LAUREL
>
> worker.clientloadbalancer.type=lb
> worker.clientloadbalancer.method=B
> worker.clientloadbalancer.sticky_session=1
> worker.clientloadbalancer.sticky_session_force=1
> worker.clientloadbalancer.local_worker_only=1
> worker.clientloadbalancer.balanced_workers=BLUFF,LAUREL
>
> worker.adaptorloadbalancer.local_worker_only=1
> worker.adaptorloadbalancer.type=lb
> worker.adaptorloadbalancer.method=B
> worker.adaptorloadbalancer.sticky_session=1
> worker.adaptorloadbalancer.sticky_session_force=1
> worker.adaptorloadbalancer.balanced_workers=BLUFF,LAUREL
>
> worker.jkstatus.type=status
>
> worker.list=jkstatus,adminloadbalancer,clientloadbalancer,adaptorloadbalancer
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscribe@tomcat.apache.org
> For additional commands, e-mail: dev-help@tomcat.apache.org

