opennlp-users mailing list archives

From Lance Norskog <goks...@gmail.com>
Subject Re: Multithreaded model use?
Date Sat, 19 May 2012 23:26:07 GMT
Right, I'm loading the models once and caching them. Each active use then
creates its own ME-based detector/tokenizer/etc. on top of the cached model.
The model object is just a passive bucket; it is essentially "final".
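
In case it helps anyone else on the thread, here is a minimal sketch of that
pattern (the model path "en-sent.bin" and the sample text are just
placeholders): the SentenceModel is loaded once, and each thread builds its
own SentenceDetectorME from it.

import java.io.FileInputStream;
import java.io.InputStream;

import opennlp.tools.sentdetect.SentenceDetectorME;
import opennlp.tools.sentdetect.SentenceModel;

public class SharedModelSketch {
    public static void main(String[] args) throws Exception {
        // Load the model once and cache it; "en-sent.bin" is a placeholder path.
        InputStream modelIn = new FileInputStream("en-sent.bin");
        final SentenceModel model;
        try {
            model = new SentenceModel(modelIn);
        } finally {
            modelIn.close();
        }

        // Each thread builds its own SentenceDetectorME; only the model is shared.
        Runnable task = new Runnable() {
            public void run() {
                SentenceDetectorME detector = new SentenceDetectorME(model);
                String[] sentences = detector.sentDetect("One sentence. Another sentence.");
                System.out.println(Thread.currentThread().getName()
                        + " found " + sentences.length + " sentences");
            }
        };

        Thread t1 = new Thread(task, "worker-1");
        Thread t2 = new Thread(task, "worker-2");
        t1.start();
        t2.start();
        t1.join();
        t2.join();
    }
}

I'd expect the same to apply to TokenizerME/TokenizerModel and the other ME
classes, since it's the ME instances that aren't safe to share.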

As to JIRAs: I'm used to much larger projects, where JIRA pollution becomes a
real problem. It's not like this is Open Relevance :)

On Sat, May 19, 2012 at 3:21 AM, Jim - FooBar(); <jimpil1985@gmail.com> wrote:
> On 19/05/12 09:34, Jörn Kottmann wrote:
>>
>> On 05/19/2012 05:52 AM, Lance Norskog wrote:
>>>
>>> Are the Model classes (subclasses of
>>> opennlp.tools.util.model.BaseModel) guaranteed to be thread-safe? That
>>> is, can I re-use the same SentenceModel in two different
>>> SentenceDetectors and be sure there are no concurrency problems?
>>>
>>> If this is true, please add this to the BaseModel javadoc.
>>>
>>
>> They are thread-safe. You are welcome to open a JIRA issue
>> and provide a patch to fix this.
>>
>> Jörn
>
>
> Hi Jorn,
>
> If they are thread-safe, what is there to fix? Are you just referring to the
> documentation?
>
> So, the models are thread-safe, but the detector, finder, and tokenizer objects
> are not?
>
> Jim
>
>



-- 
Lance Norskog
goksron@gmail.com
