Re: OutOfMemory with large number of inserts

From: Juergen Saar (juerge..saar.org)
Date: Mon May 24 2010 - 07:21:46 UTC


    First try: use a new ObjectContext for each CSV file ...

    If that doesn't solve the memory problem, try committing after, say,
    every 200 inserts and then creating a new ObjectContext ...
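
    A minimal sketch of the combined approach against Cayenne 3.0 (Line,
    RelatedObj, getRelatedObj() and setToRelatedObject() are the names from
    your pseudocode below; readCsv(), setColA() and BATCH_SIZE are
    placeholders I made up, and I pass the context into getRelatedObj() so
    the parent is registered in the current context):

    import java.io.File;
    import java.util.List;

    import org.apache.cayenne.ObjectContext;
    import org.apache.cayenne.access.DataContext;

    public class CsvImporter {

        // Assumption: tune the batch size to your heap.
        private static final int BATCH_SIZE = 200;

        public void importAll(List<File> csvFiles) {
            for (File csv : csvFiles) {
                // Fresh context per file: objects committed for earlier
                // files are no longer referenced and can be collected.
                ObjectContext context = DataContext.createDataContext();
                RelatedObj rel = getRelatedObj(context, csv);
                int pending = 0;
                for (String[] row : readCsv(csv)) { // readCsv() is a placeholder
                    // newObject() both creates and registers the object,
                    // so a separate registerNewObject() call is not needed.
                    Line l = context.newObject(Line.class);
                    l.setColA(row[0]); // ... one setter per CSV column (5 total)
                    l.setToRelatedObject(rel);
                    if (++pending == BATCH_SIZE) {
                        context.commitChanges();
                        // Drop the old context so its committed objects can
                        // be garbage-collected, then re-attach the parent
                        // object to the new context.
                        context = DataContext.createDataContext();
                        rel = (RelatedObj) context.localObject(rel.getObjectId(), null);
                        pending = 0;
                    }
                }
                context.commitChanges();
            }
        }
    }

    The point is that a context keeps track of every object registered in it,
    so the only reliable way to release those objects is to let go of the
    whole context.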

    2010/5/24 Steve Wells <websystem..mail.com>

    > Hi All,
    >
    > I am importing from a lot of CSV files (2200+), and each CSV has between
    > 0 and 3000 rows; each row creates a new DataObject in a DataContext:
    >
    > The code is simple enough and distilled here is:
    >
    > for each csv file
    >     RelatedObj rel = getRelatedObj()
    >     for each line in csv file
    >         Line l = new Line()
    >         l.setXxx(...)                 // one setter per column; 5 cols are set
    >         l.setToRelatedObject(rel)
    >         dataContext.registerNewObject(l)
    >     end for each line in csv file
    >     dataContext.commitChanges()
    > end
    >
    > This goes well for a while, but no matter how high I set the heap size I
    > eventually get an OutOfMemoryError... I can see the speed of each commit
    > slow down over time as free heap diminishes.
    >
    > I have tried a few different ways around this, but they all end up the
    > same. Initially I tried tuning the SQL server instance, but everything
    > points to objects in memory not being released.
    >
    > Does anyone have experience with Cayenne (3.0RC3) and this number of
    > inserts who could advise?
    >
    > Cheers,
    >
    > Steve
    >


