AW: out of memory

From: Peter Schröder (Peter.Schroede..reenet-ag.de)
Date: Tue Mar 06 2007 - 11:51:59 EST

  • Next message: marco turchi: "Re: out of memory"

    i experienced the same problem. i think this was because of using

    objectFromDataRow()

    which puts every object into the data-row cache and will eventually result in out-of-memory errors.

    as a workaround i used the raw data row from the query directly, instead of converting it to an object.
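    the effect can be sketched without Cayenne. the class and method names below are invented for illustration (they are not Cayenne API); the point is only that a per-row conversion which registers every row in a cache grows without bound, while working on the raw row lets each one be garbage-collected:

    ```java
    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.Iterator;
    import java.util.List;
    import java.util.Map;

    public class RowCacheSketch {

        // Hypothetical stand-in for the context's data-row cache: a
        // conversion step that registers every row it touches, the way
        // objectFromDataRow() keeps converted objects around.
        static final List<Map<String, Object>> cache = new ArrayList<>();

        static Map<String, Object> objectFromDataRow(Map<String, Object> row) {
            cache.add(row); // this is what eventually exhausts the heap
            return row;
        }

        // Iterate the rows but skip the conversion: the raw row goes out
        // of scope on each pass and nothing accumulates.
        static String run(int n) {
            Iterator<Map<String, Object>> rows = fakeResultIterator(n);
            int processed = 0;
            while (rows.hasNext()) {
                Map<String, Object> row = rows.next();
                // workaround: use the raw row here instead of calling
                // objectFromDataRow(row)
                processed++;
            }
            return "processed=" + processed + " cached=" + cache.size();
        }

        // Fake streaming result set standing in for a ResultIterator.
        static Iterator<Map<String, Object>> fakeResultIterator(final int n) {
            return new Iterator<Map<String, Object>>() {
                int i = 0;
                public boolean hasNext() { return i < n; }
                public Map<String, Object> next() {
                    Map<String, Object> row = new HashMap<>();
                    row.put("id", i++);
                    return row;
                }
            };
        }

        public static void main(String[] args) {
            System.out.println(run(1000));
        }
    }
    ```

    with the conversion line commented out, the cache stays empty no matter how many rows stream past; calling objectFromDataRow on each row would make it grow to n.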
     

    -----Original Message-----
    From: marco turchi [mailto:marco.turch..mail.com]
    Sent: Tuesday, March 6, 2007 17:48
    To: use..ayenne.apache.org
    Subject: out of memory

    Dear experts,
    I am running code that reads 800,000 records from a database, but I
    get the error "out of memory: heap....".
    I use the following Cayenne code:

    Expression exp_date = Expression.fromString(
            "dateRetrieval >= $start_d and dateRetrieval < $end_d and type = $type");
    Map parameters = new HashMap();
    parameters.put("end_d", "2006-03-09");
    parameters.put("start_d", "2005-10-07");
    parameters.put("type", "Job");
    exp_date = exp_date.expWithParameters(parameters);

    final SelectQuery feedsquery = new SelectQuery(FeedsAll.class, exp_date);
    int count = 0;
    try {
        final ResultIterator it_q = context.performIteratedQuery(feedsquery);
        while (it_q.hasNextRow()) {
            final Map row = it_q.nextDataRow();
            final FeedsAll obj = (FeedsAll)
                    context.objectFromDataRow(FeedsAll.class, new DataRow(row), true);
            .....
            context.deleteObject(obj);
        }
    }
    catch (Exception e) {
        log.error("Fatal Error: ", e);
        log.info(e.getStackTrace());
    }

    What is wrong? I understood that with performIteratedQuery I could
    read a huge number of records without memory problems.
    Can you help me?

    Thanks a lot
    Marco
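    the iteration above does stream rows, but every converted object and every pending deletion stays in the context until commit, so memory still grows with the row count. one common fix is to flush in batches. the sketch below is plain Java with an invented FakeContext (not the Cayenne DataContext API); it only demonstrates that flushing every BATCH_SIZE rows keeps the pending set bounded:

    ```java
    import java.util.ArrayList;
    import java.util.List;

    public class BatchedDeleteSketch {
        static final int BATCH_SIZE = 1000;

        // Hypothetical stand-in for a context that accumulates pending
        // deletions until a commit flushes them.
        static class FakeContext {
            final List<Object> pending = new ArrayList<>();
            int committed = 0;
            void deleteObject(Object o) { pending.add(o); }
            void commitChanges() { committed += pending.size(); pending.clear(); }
        }

        static int process(int totalRows) {
            FakeContext context = new FakeContext();
            for (int i = 0; i < totalRows; i++) {
                context.deleteObject("row-" + i);
                // Flushing every BATCH_SIZE rows keeps the in-memory
                // pending set at most BATCH_SIZE, instead of letting it
                // grow to totalRows.
                if (context.pending.size() >= BATCH_SIZE) {
                    context.commitChanges();
                }
            }
            context.commitChanges(); // flush the remainder
            return context.committed;
        }

        public static void main(String[] args) {
            System.out.println("committed=" + process(800000));
        }
    }
    ```

    all 800,000 deletions still happen, but no more than BATCH_SIZE of them are ever held in memory at once; the same batching idea applies whatever the real commit call looks like.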



    This archive was generated by hypermail 2.0.0 : Tue Mar 06 2007 - 11:52:35 EST