OutOfMemoryError: reading a large number of objects one by one

From: Tomi N/A (hefes..mail.com)
Date: Mon May 14 2007 - 13:57:01 EDT

  • Next message: Andrus Adamchik: "Re: OutOfMemoryError: reading a large number of objects one by one"

    I'll try to get straight to the point.
    I need:
    - to "stream" data from potentially large tables
    - want the app to use no more than a couple of MB of RAM, as
    (theoretically) there shouldn't be a reason for it to need any more

    I'm using Cayenne 1.2-something (no, switching to something a bit more
    modern is not a high priority... yet).

    Have tried with a paginated query (roughly as sketched after this list), failed.
    Have tried with dc.performIteratedQuery(...), failed.
    Have tried with a combination, failed.
    Have tried various values of cayenne.DataRowStore.snapshot.size (10^3
    to 10^5), failed.
    Have not tried combining a low DataRowStore.snapshot.size value with
    an iterated/paginated query.
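
    For reference, that paginated attempt was something along these lines
    (reconstructed, not the exact code; same MyClassA and Util names as in
    the snippet further down):

    Expression condition = ExpressionFactory.matchExp(MyClassA.TO_B_PROPERTY, b);
    SelectQuery query = new SelectQuery(MyClassA.class, condition);
    query.setPageSize(100); // fault objects in, 100 at a time

    // with a page size set, this returns a lazily resolved (paginated) list
    List results = Util.getCommonContext().performQuery(query);
    for (Iterator it = results.iterator(); it.hasNext();) {
        MyClassA mca = (MyClassA) it.next();
        System.out.println(mca.getAttr1().toString());
    }

    As far as I can tell, every page that gets resolved stays registered in
    the DataContext, so memory still grows with the size of the result set.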

    This was the last (failed) attempt:

    Expression condition = ExpressionFactory.matchExp(MyClassA.TO_B_PROPERTY, b);
    SelectQuery query = new SelectQuery(MyClassA.class, condition);
    query.setFetchLimit(10000); // temporary, see below
    query.setPageSize(100);

    ResultIterator ri;
    try {
        ri = Util.getCommonContext().performIteratedQuery(query);
        while (ri.hasNextRow()) {
            DataRow dr = (DataRow) ri.nextDataRow();
            // turn the raw row into a registered DataObject
            MyClassA mca = (MyClassA)
                    Util.getCommonContext().objectFromDataRow(MyClassA.class, dr, false);

            System.out.println(mca.getAttr1().toString());
            System.out.println(mca.getToMyClassC().getAttr3().toString());
        }
    } catch (CayenneException ex) {
        ex.printStackTrace();
    }
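
    A crude way to watch how quickly the heap fills up would be to print
    the used heap every few hundred rows inside that loop (plain JDK,
    nothing Cayenne-specific):

    // rough used-heap estimate; a profiler or heap dump would be more precise
    Runtime rt = Runtime.getRuntime();
    long usedMb = (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
    System.out.println("used heap: " + usedMb + " MB");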

    The fetch limit is only a temporary measure: I used it to narrow down
    where I run out of memory, which is somewhere between 1,000 and 10,000
    objects.
    Now... I see no reason why a snippet of code like this one (working
    with only a handful of objects at any given time) couldn't handle an
    arbitrarily large dataset in no more than 1MB of RAM, but I'll be
    generous and say it's great if it stays under, say, 30MB. :)
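
    One combination I haven't tried yet (and will, unless someone tells me
    it's pointless): close the ResultIterator in a finally block and swap
    in a fresh DataContext every N rows, so that the objects already
    converted can be garbage collected instead of piling up in the
    ObjectStore. Untested sketch; DataContext.createDataContext() and the
    batch size of 100 are just my guesses:

    Expression condition = ExpressionFactory.matchExp(MyClassA.TO_B_PROPERTY, b);
    SelectQuery query = new SelectQuery(MyClassA.class, condition);

    DataContext context = DataContext.createDataContext();
    ResultIterator ri = null;
    int processed = 0;

    try {
        ri = context.performIteratedQuery(query);
        while (ri.hasNextRow()) {
            DataRow dr = (DataRow) ri.nextDataRow();
            MyClassA mca = (MyClassA) context.objectFromDataRow(MyClassA.class, dr, false);

            System.out.println(mca.getAttr1().toString());
            System.out.println(mca.getToMyClassC().getAttr3().toString());

            // replace the context every 100 rows so the objects it has
            // registered become unreachable and can be collected
            if (++processed % 100 == 0) {
                context = DataContext.createDataContext();
            }
        }
    } catch (CayenneException ex) {
        ex.printStackTrace();
    } finally {
        if (ri != null) {
            try {
                ri.close(); // releases the underlying JDBC connection
            } catch (CayenneException ex) {
                ex.printStackTrace();
            }
        }
    }

    If the iterator really just streams rows off the JDBC connection and
    doesn't care which context converts them, I'd expect memory use to stay
    flat; if not, I'm out of ideas.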

    Opinions, hints, suggestions, possibilities?

    TIA,
    t.n.a.


