lucene-solr-user mailing list archives

From Chris Hostetter <>
Subject Re: multiple dateranges/timeslots per doc: modeling openinghours.
Date Tue, 11 Oct 2011 01:21:17 GMT

: Conceptually
: the Join-approach looks like it would work from paper, although I'm not a
: big fan of introducing a lot of complexity to the frontend / querying part
: of the solution.

you lost me there -- i don't see how using join would impact the front end 
/ query side at all.  your query clients would never even know that a join 
had happened (your indexing code would certainly have to know about 
creating those special case docs to join against, obviously)

: As an alternative, what about using your fieldMaskingSpanQuery-approach
: solely (without the JOIN-approach)  and encode open/close on a per day
: basis?
: I didn't mention it, but I 'only' need 100 days of data, which would lead to
: 100 open and 100 close values, not counting the pois with multiple
: Data then becomes:
: open: 20111020_12_30, 20111021_12_30, 20111022_07_30, ...
: close: 20111020_20_00, 20111021_26_30, 20111022_12_30, ...

aw hell ... i assumed you needed to support an arbitrarily large number 
of special case open+close pairs per doc.

if you only have to support a fixed number (N=100) of open+close values you 
could just have N*2 date fields and a BooleanQuery containing N 2-clause 
BooleanQueries, each containing range queries against one pair of your date 
fields. ie...

  ((+open00:[* TO NOW] +close00:[NOW+3HOURS TO *])
   (+open01:[* TO NOW] +close01:[NOW+3HOURS TO *])
   (+open02:[* TO NOW] +close02:[NOW+3HOURS TO *])
   ...
   (+open99:[* TO NOW] +close99:[NOW+3HOURS TO *]))

...for a lot of indexes, 100 clauses is small potatoes as far as number of 
boolean clauses go, especially if many of them are going to short circuit 
out because there won't be any matches at all.

: Alternatively, how would you compare your suggested approach with the
: approach by David Smiley using either SOLR-2155 (Geohash prefix query
: filter) or LSP:
: That would work right now, and the LSP-approach seems pretty elegant to me.

I'm afraid i'm totally ignorant of how the LSP stuff works so i can't 
really comment there.

If i understand what you mean about mapping the open/close concepts to 
lat/lon concepts, then i can see how it would be useful for multiple pair 
wise (absolute) date ranges, but i'm not really sure how you would deal 
with the different open+close pairs per day (or on different days of the 
week, or special days of the year) using the lat+lon conceptual model ... 
I guess if the LSP stuff supports arbitrary N-dimensional spaces then you 
could model day of week as a dimension ... but it still seems like you'd 
need multiple fields for the special case days, right?

How it would compare performance-wise: no idea.

