Re: large data loads - best practices ?

From: Andrus Adamchik (andru..bjectstyle.org)
Date: Thu Apr 24 2003 - 15:25:09 EDT


    From the subject I assume that you are doing some kind of offline batch
    processing? Or is this a web application with lots of sessions that share
    some big areas of common read-only data?

    The solution will depend on the answer :-)

    BTW, Arndt, who is using Cayenne for batch processing of huge volumes of
    data, is currently working on an example application that will be included
    in future releases of Cayenne. He will be showing exactly that - how
    to keep memory usage at bay, no matter how many rows you have to read
    through.

    In short, ResultIterator and DataContext.unregisterObjects() are your
    friends:

    http://objectstyle.org/cayenne/userguide/perform/index.html#iterator
    http://objectstyle.org/cayenne/api/cayenne/org/objectstyle/cayenne/access/DataContext.html#unregisterObject(org.objectstyle.cayenne.DataObject)
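
    For the read side, the basic pattern looks roughly like this. Just a
    sketch - "Artifact" is a placeholder ObjEntity name, and you should
    double-check the method signatures against the Cayenne version you are
    running:

    import java.util.Map;

    import org.objectstyle.cayenne.CayenneException;
    import org.objectstyle.cayenne.access.DataContext;
    import org.objectstyle.cayenne.access.ResultIterator;
    import org.objectstyle.cayenne.query.SelectQuery;

    public class BigScan {

        public void scan(DataContext ctx) throws CayenneException {

            // "Artifact" is a placeholder ObjEntity name
            SelectQuery query = new SelectQuery("Artifact");

            // performIteratedQuery streams rows one at a time instead of
            // fetching the whole result set into memory
            ResultIterator it = ctx.performIteratedQuery(query);
            try {
                while (it.hasNextRow()) {
                    Map row = (Map) it.nextDataRow();
                    // process the raw row data here; if you turn rows into
                    // DataObjects, unregister them when you are done
                }
            }
            finally {
                // the iterator holds an open JDBC connection - always close it
                it.close();
            }
        }
    }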

    Andrus

    > Having a bit of trouble with cayenne being very memory intensive...
    >
    > We are using it to load staging tables (lots of data) and have run into
    > a lot of trouble around memory.
    >
    > We are running on a Pentium 4 with 1GB of memory... and allocating
    > 512MB to the JVM helps somewhat.
    >
    > I was just wondering if there is a best practice for doing this type of
    > process.
    >
    > I.e. committing 1 row at a time, clearing context each iteration maybe,
    > getting a whole new context each iteration?
    >
    > Any suggestions for performance tuning would be appreciated.
    >
    > Troy
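
    On the specific questions above: committing one row at a time will be
    slow; committing in chunks and then unregistering the committed objects
    (or simply replacing the DataContext with a fresh one every so often)
    keeps memory flat. Roughly - again just a sketch, where "StagingRecord",
    the "name" property and the incoming row Maps are placeholders, so check
    the exact DataContext method names against your version:

    import java.util.ArrayList;
    import java.util.Iterator;
    import java.util.List;
    import java.util.Map;

    import org.objectstyle.cayenne.DataObject;
    import org.objectstyle.cayenne.access.DataContext;

    public class StagingLoader {

        private static final int BATCH_SIZE = 500;

        // "sourceRows" stands in for whatever feeds the load - a flat file
        // reader, a ResultIterator over another database, etc.
        public void load(List sourceRows) {

            DataContext ctx = DataContext.createDataContext();
            List pending = new ArrayList();

            for (Iterator i = sourceRows.iterator(); i.hasNext();) {
                Map row = (Map) i.next();

                // "StagingRecord" is a placeholder ObjEntity name
                DataObject record =
                    ctx.createAndRegisterNewObject("StagingRecord");
                record.writeProperty("name", row.get("name"));
                pending.add(record);

                if (pending.size() >= BATCH_SIZE) {
                    flush(ctx, pending);
                }
            }

            // commit whatever is left in the last partial chunk
            flush(ctx, pending);
        }

        private void flush(DataContext ctx, List pending) {
            if (pending.isEmpty()) {
                return;
            }
            ctx.commitChanges();

            // committed rows are no longer needed in memory - drop them
            // from the context so it does not grow with every insert
            ctx.unregisterObjects(pending);
            pending.clear();
        }
    }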


