OutOfMemory with large number of inserts

From: Steve Wells (websystem..mail.com)
Date: Mon May 24 2010 - 06:58:45 UTC


    Hi All,

    I am importing from a large number of CSV files (2200+); each file has
    between 0 and 3000 rows, and each row creates a new DataObject registered
    with a DataContext.

    The code is simple enough; distilled, it is:

    for each csv file
       RelatedObj rel = getRelatedObj()
       for each line in csv file
          Line l = new Line()
          l.setXxx(...)              // one setter per CSV column; 5 columns are set
          l.setToRelatedObject(rel)
          dataContext.registerNewObject(l)
       end for each line
       dataContext.commitChanges()
    end for each csv file
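    The loop above keeps registering objects into the same context, and a
    context that holds strong references to every committed object will grow
    without bound. As a minimal sketch of that retention pattern (plain Java,
    no Cayenne; `FakeContext` is a hypothetical stand-in, not the real
    DataContext), the fix is to discard committed objects after each commit,
    which in Cayenne 3.0 would correspond to creating a fresh DataContext per
    file or unregistering the committed objects, if my reading of the API is
    right:

    ```java
    import java.util.ArrayList;
    import java.util.List;

    public class ContextRetentionDemo {

        // Hypothetical stand-in for a context's internal object registry.
        static class FakeContext {
            final List<Object> registered = new ArrayList<>();

            void registerNewObject(Object o) {
                registered.add(o);
            }

            void commitChanges() {
                // Simulates releasing committed objects after a commit.
                // Without this clear(), every object from every file would
                // stay strongly referenced until the heap is exhausted.
                registered.clear();
            }
        }

        public static void main(String[] args) {
            FakeContext ctx = new FakeContext();
            for (int file = 0; file < 3; file++) {
                for (int row = 0; row < 1000; row++) {
                    ctx.registerNewObject(new Object());
                }
                ctx.commitChanges();
            }
            System.out.println("retained after commits: " + ctx.registered.size());
        }
    }
    ```

    With the per-commit release the registry ends empty; comment out the
    clear() and 3000 objects remain referenced, which is the heap-growth
    profile described above.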

    This goes well for a while, but no matter how high I set the heap size I
    eventually get an OutOfMemoryError. I can also see each commit slow down
    over time as free heap diminishes.

    I have tried a few different ways around this, but they all end the same
    way. Initially I tuned the SQL Server instance, but everything points to
    objects in memory not being de-allocated.

    Has anyone had experience with Cayenne (3.0RC3) at this volume of inserts
    who could advise?

    Cheers,

    Steve



    This archive was generated by hypermail 2.0.0 : Mon May 24 2010 - 06:59:17 UTC