Cayenne has to fetch the primary keys of your ~683k records first,
which is why it is taking so long. After that, it will use the PKs to
fetch all the records for each page (50 at a time in your case) you
access. Eventually you'll have all ~683k in memory (if you have
enough memory). This will be horribly inefficient. You should try to
find a way to divide your data set into smaller chunks that you can
process one at a time.
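
For example, one common approach is to walk the table in fixed-size PK
ranges, running one bounded query per range so only a small slice is ever
in memory. This is just a sketch: the entity name MyEntity and an integer
"id" PK column are assumptions, and the Cayenne calls are shown in
comments; only the range arithmetic is real code.

```java
// Sketch of a chunked scan over ~683k rows using PK ranges instead of one
// huge paginated query. MyEntity and the "id" column are hypothetical.
public class ChunkedScan {

    // Split [minPk, maxPk] into consecutive ranges of at most chunkSize ids.
    static long[][] chunks(long minPk, long maxPk, long chunkSize) {
        int n = (int) ((maxPk - minPk) / chunkSize + 1);
        long[][] ranges = new long[n][2];
        for (int i = 0; i < n; i++) {
            long lo = minPk + i * chunkSize;
            long hi = Math.min(lo + chunkSize - 1, maxPk);
            ranges[i][0] = lo;
            ranges[i][1] = hi;
        }
        return ranges;
    }

    public static void main(String[] args) {
        for (long[] r : chunks(1, 683_000, 10_000)) {
            // For each range, run a bounded query, e.g. (Cayenne 2.x style):
            //   SelectQuery q = new SelectQuery(MyEntity.class,
            //       ExpressionFactory.betweenExp("id", r[0], r[1]));
            //   List rows = context.performQuery(q);
            // Process the rows, then let them go out of scope so the
            // DataContext isn't forced to hold all 683k objects at once.
        }
    }
}
```

Each query then touches at most chunkSize rows, so nothing ever has to
fetch all ~683k primary keys up front.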
mrg
On Sun, Apr 12, 2009 at 8:42 PM, Paul Logasa Bogen II <pl..amu.edu> wrote:
> I have a table with ~683k entries in it. I have a maintenance task that
> requires me to hit every entry. Since I know the results are much too big to
> return all at once, I've set the page size to 50. However, Cayenne appears
> to be attempting the full query first before falling back to paging. This is
> taking nearly half an hour for it to fall back, at which point everything
> works flawlessly. Is there a way to force the query to be paginated or to
> reduce the timeout on a query?
>
> plb
>
This archive was generated by hypermail 2.0.0 : Sun Apr 12 2009 - 20:53:10 EDT