Re: OutOfMemory with large number of inserts

From: Steve Wells (websystem..mail.com)
Date: Mon May 24 2010 - 10:32:17 UTC

    Typical of Cayenne users: exceedingly fast, helpful answers! I've tried a
    few more things based on your replies, but nothing positive so far.

    Andrey, I am *mostly* only creating new objects in the DataContext. But I
    am also selecting the related (master, a.k.a. 'rel') data objects and
    "connecting" them to the detail objects, on the order of 1-2000 details per
    master (could be 1-1 or 1-3000). One thing I did notice was that I was
    selecting the 'rel' object using the shared cache. I set the cache to NONE
    for that select and all the creates/inserts ran approximately twice as fast
    (I do have some basic timings in place). Perhaps this is an interesting
    separate thread of discussion.
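
    For what it's worth, this is roughly the change I made; "RelatedObj" and
    the "code" attribute are stand-ins, not the real names from my model:

        import java.util.List;
        import org.apache.cayenne.exp.ExpressionFactory;
        import org.apache.cayenne.query.QueryCacheStrategy;
        import org.apache.cayenne.query.SelectQuery;

        // look up the 'rel' (master) object for the current csv file
        SelectQuery query = new SelectQuery(RelatedObj.class,
                ExpressionFactory.matchExp("code", csvCode));
        query.setCacheStrategy(QueryCacheStrategy.NO_CACHE); // was using the shared cache
        List rels = dataContext.performQuery(query);
        RelatedObj rel = rels.isEmpty() ? null : (RelatedObj) rels.get(0);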

    Juergen, yes, I already had a new ObjectContext for each CSV, and I was
    committing every 400 inserts (also tried 200, 1000, etc.). I also created a
    new ObjectContext at the same point, though I expect moving the 'related'
    object into the new context may negate the point of creating it...so, same
    result.
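
    In outline, what I am doing per file at the moment (helper names like
    readRows() and getRelatedObj() are placeholders for my own code; the
    localObject() call is the 'related object' move mentioned above):

        import org.apache.cayenne.access.DataContext;

        DataContext context = DataContext.createDataContext(); // fresh context per csv
        RelatedObj rel = getRelatedObj(context, csvFile);
        int n = 0;
        for (String[] row : readRows(csvFile)) {
            Line l = new Line();
            context.registerNewObject(l);
            // ... set the 5 columns from 'row' ...
            l.setToRelatedObject(rel);
            if (++n % 400 == 0) {                              // also tried 200 and 1000
                context.commitChanges();
                context = DataContext.createDataContext();     // new context mid-file
                rel = (RelatedObj) context.localObject(rel.getObjectId(), null);
            }
        }
        context.commitChanges();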

    Andrus, I've deleted the reverse relationship and the same issue persists.
    What did work was removing the forward relationship call,
    'l.setToRelatedObject(rel)': everything ran fine with no OutOfMemory. But I
    do need both relationship directions. At least I now have some direction in
    which to look.
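
    (To be clear on what I mean by "both directions" -- the reverse accessor
    name below is illustrative, not from my model:)

        l.setToRelatedObject(rel); // forward: Line -> RelatedObj (to-one)
        rel.getLines();            // reverse: RelatedObj -> Line (to-many)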

    Thanks all for your help.

    On 24 May 2010 17:21, Juergen Saar <juerge..saar.org> wrote:

    > First try: use a new ObjectContext for each csv file ...
    >
    > If this doesn't solve the memory problem, you should try to make a commit
    > after, let's say, 200 inserts and then create a new ObjectContext ...
    >
    > 2010/5/24 Steve Wells <websystem..mail.com>
    >
    > > Hi All,
    > >
    > > I am importing from a lot of csv files (2200+) and each csv has between 0
    > > and 3000 rows; each row will create a new DataObject in a DataContext,
    > > etc.:
    > >
    > > The code is simple enough; distilled, it is:
    > >
    > > for each csv file
    > >     RelatedObj rel = getRelatedObj()
    > >     for each line in csv file
    > >         Line l = new Line()
    > >         l.setXxx(...)                 // 5 cols are set, one per csv column
    > >         l.setToRelatedObject(rel)
    > >         dataContext.registerNewObject(l)
    > >     end for each line in csv file
    > >     dataContext.commitChanges()
    > > end
    > >
    > > This goes well for a while, but no matter how high I set the heap size I
    > > eventually get OutOfMemoryException....I can see the speed of each commit
    > > slow down over time as the heap size is diminished.
    > >
    > > I have tried a few different ways around this but all end up the same.
    > > Initially tuning the SQL server instance, but everything points to
    > > objects in memory not being de-allocated.
    > >
    > > Has anyone had experience with Cayenne (3.0RC3) and this number of
    > > inserts they could advise with?
    > >
    > > Cheers,
    > >
    > > Steve
    > >
    >



    This archive was generated by hypermail 2.0.0 : Mon May 24 2010 - 10:32:50 UTC