
RE: Thoughts on simple paged search results



At 10:55 PM 2002-01-05, Howard Chu wrote:
>How should a feature like this work in the presence of search limits?

>Suppose you have a limit of 50 entries per search configured; right now
>any search that returns more than 50 candidates will be aborted.
>If a paged search is aborted the same way, then there's no benefit to the
>paging feature. If it is allowed to continue in 50-entry pages, then
>it seems the intent of the restriction has been subverted.

There are three different limits:
        a) user-requested size limits
        b) administratively specified size limits
        c) administrative candidate size limits

a) if the page size is greater than the size limit, the
size limit will be exceeded before paging.  Kind of a
stupid request.

b) if the page size is greater than the admin size limit,
the size limit will be exceeded before paging.  Here, the
user can lower the page size and use paging to obtain
all the entries.  Note that even without this, the
client could likely walk the tree or use other means
to obtain all the entries.

c) if the candidate limit is exceeded, the request is
refused.  Paging cannot be used to get around this limit.
However, a client could walk the tree or use other means
to obtain all the entries.
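
For illustration, here is a rough sketch in C of how the three
checks might be ordered when a paged search arrives.  Nothing
below is actual slapd code; the struct and function names are
made up, it is just the decision logic of (a)-(c) made concrete:

#include <stddef.h>
#include <stdio.h>

typedef enum { LIMIT_OK, LIMIT_SIZE_EXCEEDED, LIMIT_REFUSED } limit_rc;

struct limits {
        size_t user_size;       /* (a) size limit from the request */
        size_t admin_size;      /* (b) administrative size limit */
        size_t cand_max;        /* (c) administrative candidate limit */
};

static limit_rc
check_paged_search( const struct limits *l,
        size_t page_size, size_t ncandidates )
{
        /* (c): too many candidates, refuse outright; paging
         * cannot be used to get around this limit */
        if ( l->cand_max && ncandidates > l->cand_max )
                return LIMIT_REFUSED;

        /* (a) and (b): a page larger than either size limit
         * hits that limit before any paging happens */
        if ( l->user_size && page_size > l->user_size )
                return LIMIT_SIZE_EXCEEDED;
        if ( l->admin_size && page_size > l->admin_size )
                return LIMIT_SIZE_EXCEEDED;

        return LIMIT_OK;
}

int
main( void )
{
        struct limits l = { 100, 50, 1000 };    /* example config */
        printf( "%d\n", check_paged_search( &l, 25, 200 ) );  /* ok */
        printf( "%d\n", check_paged_search( &l, 80, 200 ) );  /* (b) */
        printf( "%d\n", check_paged_search( &l, 25, 2000 ) ); /* (c) */
        return 0;
}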

>As far as creating some kind of cookie to maintain search state - perhaps it
>would be simpler just to maintain a lastid counter and let each backend
>regenerate the candidate list each time. Keeping ID lists hanging around seems
>a bit dodgy. The backends could add an internal filter clause with (e->e_id >
>lastid) to resume the search.
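
For concreteness, that stateless approach might look roughly like
this sketch.  The names (paged_state, candidate_in_page) and the
IDs are made up for illustration; only the (e->e_id > lastid)
test comes from the description above:

#include <stdio.h>

typedef unsigned long ID;

/* hypothetical per-search state: only the last ID returned,
 * carried in the paging cookie */
typedef struct paged_state {
        ID lastid;
} paged_state;

/* the internal (e->e_id > lastid) filter clause, as a predicate */
static int
candidate_in_page( const paged_state *ps, ID eid )
{
        return eid > ps->lastid;
}

int
main( void )
{
        ID cands[] = { 3, 7, 12, 20, 25, 31 }; /* regenerated list */
        paged_state ps = { 12 };        /* cookie: stopped at ID 12 */
        size_t i, page_size = 2, sent = 0;

        for ( i = 0; i < sizeof(cands)/sizeof(*cands)
                        && sent < page_size; i++ ) {
                if ( candidate_in_page( &ps, cands[i] ) ) {
                        printf( "send entry %lu\n", cands[i] );
                        ps.lastid = cands[i];
                        sent++;
                }
        }
        printf( "next cookie: lastid=%lu\n", ps.lastid );
        return 0;
}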

Assuming we limit clients to one (or maybe two) paged searches
at a time, I think it better to maintain the candidate
list than to rebuild the list for each page.
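
As a rough sketch of what I have in mind (again, illustrative
names only, not actual slapd structures): the connection keeps
the saved candidate list and a cursor, and the size of the slot
array enforces the one-at-a-time restriction:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

typedef unsigned long ID;

typedef struct paged_search {
        ID      *candidates;    /* IDs computed on the first page */
        size_t  ncandidates;
        size_t  pos;            /* next entry to return */
        int     in_use;
} paged_search;

#define MAX_PAGED_PER_CONN 1    /* one paged search at a time */

typedef struct connection {
        paged_search paged[MAX_PAGED_PER_CONN];
} connection;

/* claim a slot, saving the freshly computed candidate list;
 * NULL means the client already has its quota outstanding */
static paged_search *
begin_paged_search( connection *c, ID *cands, size_t ncands )
{
        int i;
        for ( i = 0; i < MAX_PAGED_PER_CONN; i++ ) {
                if ( !c->paged[i].in_use ) {
                        c->paged[i].candidates = cands;
                        c->paged[i].ncandidates = ncands;
                        c->paged[i].pos = 0;
                        c->paged[i].in_use = 1;
                        return &c->paged[i];
                }
        }
        return NULL;
}

/* copy up to page_size IDs from the saved list, advancing the
 * cursor; state is released when the list is exhausted */
static size_t
next_page( paged_search *ps, ID *out, size_t page_size )
{
        size_t n = 0;
        while ( n < page_size && ps->pos < ps->ncandidates )
                out[n++] = ps->candidates[ps->pos++];
        if ( ps->pos == ps->ncandidates ) {
                free( ps->candidates );
                memset( ps, 0, sizeof(*ps) );
        }
        return n;
}

int
main( void )
{
        static const ID src[] = { 1, 2, 3, 4, 5 };
        ID *cands = malloc( sizeof(src) );
        ID page[2];
        size_t n;
        connection conn = { 0 };
        paged_search *ps;

        if ( !cands ) return 1;
        memcpy( cands, src, sizeof(src) );
        ps = begin_paged_search( &conn, cands, 5 );
        while ( ps->in_use && ( n = next_page( ps, page, 2 ) ) > 0 )
                printf( "page of %zu entries\n", n );
        return 0;
}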

Since most uses of this control will be by ordinary users, I
assume the user will be under some administrative candidate
size restriction, case (c).