Re: OutOfMemoryError: reading a large number of objects one by one

From: Derek Rendall (derek.rendal..mail.com)
Date: Mon May 14 2007 - 19:11:53 EDT


    OK, my memory on this stuff is now going back a year or two, but I did
    do some extensive playing around with exactly this scenario. My example
    died at about 6-7 thousand records with a 64 MB heap. I found out why,
    and the reasons seemed pretty reasonable at the time (something to do
    with each data row being inflated to an object and then cached in the
    object store as an object). As I had some time back then, I ended up
    creating a version of Cayenne that handled pretty large data sets (I
    only needed 100 thousand records handled) by making some relatively
    minor adjustments. I did discuss some of the techniques on the mailing
    lists, but I can't seem to find the entries now. I did find a Jira
    issue:
    http://issues.apache.org/cayenne/browse/CAY-294
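
    For what it's worth, this is roughly the shape of the code that was
    dying for me. I'm reconstructing it from memory, so the entity class
    ("Artist" here stands in for whatever generated DataObject you are
    fetching) and the exact API calls may have drifted between Cayenne
    versions - treat it as a sketch of the pattern, not gospel:

        import org.apache.cayenne.CayenneException;
        import org.apache.cayenne.DataRow;
        import org.apache.cayenne.access.DataContext;
        import org.apache.cayenne.access.ResultIterator;
        import org.apache.cayenne.query.SelectQuery;

        void readAll(DataContext ctx) throws CayenneException {
            // stream rows one at a time instead of materializing the list
            ResultIterator rows =
                    ctx.performIteratedQuery(new SelectQuery(Artist.class));
            try {
                while (rows.hasNextRow()) {
                    DataRow row = (DataRow) rows.nextDataRow();
                    // here is the catch: inflating the row registers the
                    // resulting object in the context's ObjectStore, so
                    // every record you touch stays referenced until the
                    // context itself goes away
                    Artist artist = (Artist) ctx.objectFromDataRow(
                            Artist.class, row, false);
                    // ... do something with artist ...
                }
            } finally {
                rows.close();
            }
        }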

    Try doing the following every 100 records or so (BTW, not sure if this
    stuff is actually still around :-) YMMV:

        // drop the objects accumulated since the last checkpoint...
        getDataContext().getObjectStore().unregisterNewObjects();
        // ...and start a fresh tracking window for the next batch
        getDataContext().getObjectStore().startTrackingNewObjects();
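
    Folded into the read loop from my earlier sketch, that would look
    something like this (100 is just the checkpoint interval that worked
    for me; tune it to your heap):

        int count = 0;
        while (rows.hasNextRow()) {
            DataRow row = (DataRow) rows.nextDataRow();
            Artist artist = (Artist) ctx.objectFromDataRow(
                    Artist.class, row, false);
            // ... do something with artist ...
            if (++count % 100 == 0) {
                // let the ObjectStore forget everything registered since
                // the last checkpoint so it becomes garbage-collectable
                ctx.getObjectStore().unregisterNewObjects();
                ctx.getObjectStore().startTrackingNewObjects();
            }
        }

    The point is just to stop the ObjectStore holding a reference to every
    object you have already processed; as long as you don't need the
    earlier objects again, unregistering them lets the GC reclaim them.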

    Hope that helps


