Out of memory exception when creating a large number of objects

From: Jean-Paul Le Fèvre (jean-paul.lefevr..ea.fr)
Date: Mon Jul 23 2007 - 09:29:35 EDT


    Hi,

    I'm trying to import a fairly large amount of data into my database.
    The input is an XML-formatted file describing more than 10 million
    objects, each with tens of attributes. The application parses the input
    file, creates the Cayenne objects and commits the changes if requested.

    As you can imagine, I'm running into out-of-memory errors while doing
    this. Unfortunately, at this point, I'm still unable to load my big
    input file.

    To figure out what is happening I'm monitoring the application's behavior
    with jconsole. My tactic is the following: every 10000 objects (this number
    is a parameter) I call rollbackChanges() or commitChanges(), roughly as in
    the sketch below.
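
    Here is a simplified sketch of the import loop. The entity class and the
    parsing/copying helpers (MyEntity, parseRecords(), copyAttributes()) are
    placeholders standing in for my real code:

    DataContext ctxt = DataContext.createDataContext();
    int count = 0;

    for (Record record : parseRecords(inputFile)) {      // parseRecords() stands in for my XML parser
        // one new Cayenne object per parsed record
        MyEntity obj = (MyEntity) ctxt.createAndRegisterNewObject(MyEntity.class);
        copyAttributes(record, obj);                      // placeholder: sets the tens of attributes

        if (++count % batchSize == 0) {                   // batchSize is the 10000-object parameter
            if (commitMode) {
                ctxt.commitChanges();                     // commit mode
            }
            else {
                ctxt.rollbackChanges();                   // rollback mode
            }
        }
    }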

    When I run the program in rollback mode, the memory used oscillates
    between a min and a max value as expected: after each rollback
    the garbage collector is free to clean up the memory.

    But in commit mode the amount of memory keeps increasing and the
    application eventually fails.
    I've tried calling unregisterNewObjects() and startTrackingNewObjects() after
    the commit:

    ctxt.commitChanges();                              // flush the current batch to the database
    ctxt.getObjectStore().unregisterNewObjects();      // drop references to freshly registered objects
    ctxt.getEntityResolver().clearCache();             // clear the entity resolver's internal caches
    ctxt.getObjectStore().startTrackingNewObjects();   // restart tracking for the next batch

    but it didn't help. It seems that Cayenne keeps a reference to the newly
    created objects somewhere, preventing the GC from reclaiming them.
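
    The only other idea I have, which I haven't tried yet, is to discard the
    context entirely after each commit and create a fresh one, something like
    the untested sketch below, but I'd rather understand why the cleanup calls
    above don't release the memory:

    ctxt.commitChanges();
    // throw the old context away; a brand-new context should hold no
    // references to the objects committed so far
    ctxt = DataContext.createDataContext();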

    Would you have an idea how to fix the problem ?
    Thanks,

    -- 
    ___________________________________________________________________
    

    Jean-Paul Le Fèvre * Mail : LeFevr..onteny.org


