Not just Cayenne, Hans. No ORM efficiently handles the scale you are
talking about. What you are doing might be workable with 50k records,
but not 2.5M. You need to break your query down into smaller chunks to
process, or explore what Andrus suggested with ResultIterator:
http://cayenne.apache.org/doc/iterating-through-data-rows.html
If you can loop over one record at a time and process it (thereby
letting the garbage collector clean out the ones you have already
processed), then your memory usage should stay fairly stable and
manageable, even if the initial query takes a while.
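Roughly something like this (untested sketch, using a made-up "MyEntity"
entity; the exact ResultIterator method names differ a bit between
Cayenne versions, so check the page above for the version you're on):

import java.util.Map;

import org.apache.cayenne.CayenneException;
import org.apache.cayenne.access.DataContext;
import org.apache.cayenne.access.ResultIterator;
import org.apache.cayenne.query.SelectQuery;

public class LargeQueryExample {

    public static void process(DataContext context) throws CayenneException {
        // Fetch data rows (plain Maps) instead of full DataObjects to keep
        // per-row overhead down and stay out of the object cache.
        SelectQuery query = new SelectQuery("MyEntity");
        query.setFetchingDataRows(true);

        ResultIterator it = context.performIteratedQuery(query);
        try {
            while (it.hasNextRow()) {
                // Only one row is held at a time; once you are done with it,
                // it becomes eligible for garbage collection.
                Map row = (Map) it.nextDataRow();
                // ... process the row ...
            }
        }
        finally {
            it.close(); // releases the underlying JDBC connection
        }
    }
}

The important bits are setFetchingDataRows(true) and closing the iterator
in a finally block, since it holds a database connection open for the
duration of the loop.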
mrg
On Fri, Nov 13, 2009 at 7:09 AM, Hans Pikkemaat
<h.pikkemaa..si-solutions.nl> wrote:
> Anyway, my conclusion is indeed: don't use cayenne for large query
> processing.