I think this is a DataContext issue: every committed object stays registered in the context, so you probably should create a new DataContext at some point.
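Something along these lines might work (untested sketch; Artifact and XmlReader are placeholders for your own DataObject subclass and XML reader, and the method names assume the Cayenne 1.2/2.0-style API):

import org.apache.cayenne.access.DataContext;

public class BulkImporter {

    private static final int BATCH_SIZE = 10000;

    public void importAll(XmlReader reader) {
        DataContext ctxt = DataContext.createDataContext();
        int count = 0;
        while (reader.hasNext()) {
            // create and populate one object per XML record
            Artifact a = (Artifact) ctxt.createAndRegisterNewObject(Artifact.class);
            a.setName(reader.nextName());
            if (++count % BATCH_SIZE == 0) {
                ctxt.commitChanges();
                // throw away the old context so its committed objects are
                // no longer referenced and can be garbage collected
                ctxt = DataContext.createDataContext();
            }
        }
        ctxt.commitChanges();   // commit the last partial batch
    }
}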
-----Original Message-----
From: Jean-Paul Le Fèvre [mailto:jean-paul.lefevr..ea.fr]
Sent: Monday, July 23, 2007 15:30
To: use..ayenne.apache.org
Subject: Out of memory exception when creating a large number of objects
Hi,
I'm trying to import a pretty big amount of data into my database.
The input is an XML-formatted file. It describes more than 10 million
objects, each having tens of attributes. The application parses the input
file, creates the Cayenne objects and commits the changes if requested.
As you can imagine, I'm facing difficulties trying to avoid out of memory
errors. Unfortunately, at this point, I'm still unable to load my big
input file.
To figure out what is happening I'm monitoring the application behavior
with jconsole. My tactic is the following: every 10000 objects (this number
is a parameter) I call rollbackChanges() or commitChanges().
When I run the program in rollback mode, it turns out that the memory used
oscillates between a min and a max value as expected: after each rollback
the garbage collector feels free to clean up the memory.
But in commit mode the amount of memory keeps on increasing and the
application eventually fails.
I've tried to call unregisterNewObjects() and startTrackingNewObjects() after
the commit:
ctxt.commitChanges();
ctxt.getObjectStore().unregisterNewObjects();
ctxt.getEntityResolver().clearCache();
ctxt.getObjectStore().startTrackingNewObjects();
but it didn't help. It seems that Cayenne keeps references to the newly created
objects somewhere, preventing the GC from working.
Would you have an idea of how to fix the problem?
Thanks,
--
___________________________________________________________________
Jean-Paul Le Fèvre * Mail : LeFevr..onteny.org