lucy-user mailing list archives

From Hao Wu <echowu...@gmail.com>
Subject Re: [lucy-user] Chinese support?
Date Tue, 21 Feb 2017 04:18:50 GMT
Hi Peter,

Thanks for the reply.

That could be a problem, but probably not in my case.

I removed the old index.

Then I ran the program with 'ChineseAnalyzer' and truncate => 0 twice; the
second run gives me this error:

'body' assigned conflicting FieldType
        LUCY_Schema_Spec_Field_IMP at cfcore/Lucy/Plan/Schema.c line 124
        at /home/hwu/perl5/lib/perl5/x86_64-linux-gnu-thread-multi/Lucy.pm
line 118.
        Lucy::Index::Indexer::new('Lucy::Index::Indexer', 'index',
'/home/hwu/data/lucy/mitbbs.index', 'schema',
'Lucy::Plan::Schema=SCALAR(0x1c56798)', 'create', 1) called at
mitbbs_index.pl line 26

Running the program with 'ChineseAnalyzer' and truncate => 1 twice gives
no error, but I want to update the index, not rebuild it.

Running the program with 'StandardTokenizer', with truncate 0 or 1, works
fine either way.

So this makes me think I must be missing something in the 'ChineseAnalyzer'
I have.
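
In case it helps anyone hitting the same error, here is a minimal sketch of
the rebuild-once, update-later pattern (assuming ChineseAnalyzer is the
custom Lucy::Analysis subclass from earlier in the thread; the document
contents are illustrative):

```perl
use Lucy::Plan::Schema;
use Lucy::Plan::FullTextType;
use Lucy::Index::Indexer;

# ChineseAnalyzer is the custom analyzer class; not part of Lucy itself.
my $analyzer = ChineseAnalyzer->new;
my $type     = Lucy::Plan::FullTextType->new( analyzer => $analyzer );

my $schema = Lucy::Plan::Schema->new;
$schema->spec_field( name => 'body', type => $type );

# First run after changing the field's analyzer: truncate => 1 discards
# the old segments, so the new FieldType cannot conflict with the one
# already stored in the index.
my $indexer = Lucy::Index::Indexer->new(
    index    => '/home/hwu/data/lucy/mitbbs.index',
    schema   => $schema,
    create   => 1,
    truncate => 1,
);
$indexer->add_doc( { body => 'example body text' } );
$indexer->commit;

# Later runs with the *same* schema can drop truncate (or pass
# truncate => 0) to update the index incrementally.
```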

On Mon, Feb 20, 2017 at 6:47 PM, Peter Karman <peter@peknet.com> wrote:

> Hao Wu wrote on 2/20/17 6:12 PM:
>
>> Still have problem when I try to update the index using the custom
>> analyzer.
>>
>> If I comment out the
>>    truncate => 1
>>
>> rerun, I got the following error:
>>
>>
>> 'body' assigned conflicting FieldType
>>         LUCY_Schema_Spec_Field_IMP at cfcore/Lucy/Plan/Schema.c line 124
>>         at /home/hwu/perl5/lib/perl5/x86_64-linux-gnu-thread-multi/Lucy
>> .pm
>> line 118.
>>         Lucy::Index::Indexer::new('Lucy::Index::Indexer', 'index',
>> '/home/hwu/data/lucy/mitbbs.index', 'schema',
>> 'Lucy::Plan::Schema=SCALAR(0x211c758)', 'create', 1) called at
>> mitbbs_index.pl line 26
>> *** Error in `perl': corrupted double-linked list: 0x00000000021113a0 ***
>>
>> If I switch the analyzer to Lucy::Analysis::StandardTokenizer, it works
>> fine; a new seg_2 is created.
>>
>> my $tokenizer = Lucy::Analysis::StandardTokenizer->new;
>> my $raw_type = Lucy::Plan::FullTextType->new(
>>         analyzer => $tokenizer,
>> );
>>
>> So I guess I must miss something in the custom Chinese Analyzer.
>>
>>
> since you changed the field definition with a new analyzer, you must
> create a new index. You cannot update an existing index with 2 different
> field definitions in the same schema.
>
>
>
> --
> Peter Karman  .  https://peknet.com/  .  https://keybase.io/peterkarman
>
