large data loads - best practices?

From: Troy McKinnon (tmckinno..rise.com)
Date: Thu Apr 24 2003 - 15:07:15 EDT

    Having a bit of trouble with Cayenne being very memory-intensive...

    We are using it to load staging tables (lots of data) and have run into
    serious memory problems.

    We are running on a Pentium 4 with 1 GB of memory; allocating 512 MB to
    the JVM (-Xmx512m) helps somewhat.

    I was just wondering whether there is a best practice for this kind of
    bulk-load process.

    E.g. committing one row at a time, clearing the context on each
    iteration, or creating a whole new context each iteration (sketched
    below)?
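
    For the last option, something like this is what I have in mind. It is
    only a rough sketch against the Cayenne 1.x API; "StagingRecord", its
    "name" attribute, and the batch size are placeholders for our real
    entities and tuning:

        import java.util.Iterator;
        import java.util.Map;

        import org.objectstyle.cayenne.CayenneDataObject;
        import org.objectstyle.cayenne.access.DataContext;

        public class StagingLoader {

            // batch size is a guess; tune against available heap
            private static final int BATCH_SIZE = 1000;

            public void load(Iterator rows) {
                DataContext context = DataContext.createDataContext();
                int count = 0;

                while (rows.hasNext()) {
                    Map row = (Map) rows.next();

                    // "StagingRecord"/"name" stand in for the real entity
                    CayenneDataObject record = (CayenneDataObject)
                            context.createAndRegisterNewObject("StagingRecord");
                    record.writeProperty("name", row.get("name"));

                    // commit and discard the context every BATCH_SIZE rows so
                    // the registered object graph never grows unbounded
                    if (++count % BATCH_SIZE == 0) {
                        context.commitChanges();
                        context = DataContext.createDataContext();
                    }
                }

                // flush whatever is left in the final partial batch
                context.commitChanges();
            }
        }

    The idea being that once a batch is committed, nothing holds a reference
    to the old context, so it and all its registered objects become eligible
    for garbage collection.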

    Any suggestions for performance tuning would be appreciated.

    Troy


