Re: Deleting a data row?

From: Peter Schröder (Peter.Schroede..reenet-ag.de)
Date: Thu Mar 08 2007 - 05:09:32 EST


    I think there are different approaches to this issue. You might use a custom query to delete the entry, or perhaps something like this:
    http://cayenne.apache.org/doc20/api/cayenne/org/apache/cayenne/query/DeleteBatchQuery.html (I haven't tried that one yet).
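    For example, a raw delete could be issued with a SQLTemplate, so no DataObject ever has to be instantiated. This is only a sketch against the Cayenne 1.2/2.0 API; the table and column names (FEEDS_ALL, ID) are made-up placeholders, not anything from the original code:

```java
// Hypothetical sketch: delete a record straight from its data row,
// without going through context.objectFromDataRow().
Map row = it_q.nextDataRow(); // data row from the ResultIterator

// FEEDS_ALL / ID are placeholders -- substitute the real table and PK column.
SQLTemplate delete = new SQLTemplate(FeedsAll.class,
        "DELETE FROM FEEDS_ALL WHERE ID = #bind($id)");
Map params = new HashMap();
params.put("id", row.get("ID"));
delete.setParameters(params);

context.performNonSelectingQuery(delete);
```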

    kind regards,
    peter
     

    -----Original Message-----
    From: Török Péter [mailto:torok..llround.net]
    Sent: Thursday, March 8, 2007 10:51 AM
    To: use..ayenne.apache.org
    Subject: Deleting a data row?

    Hello,
    I have a related question.
    Is it possible to delete a record using a data row, without first having to instantiate a data object via context.objectFromDataRow() (as in the code example below)?
    If possible, how?
    Thanks in advance,
    Péter

    -----Original Message-----
    From: Andrus Adamchik [mailto:andru..bjectstyle.org]
    Sent: Tuesday, March 06, 2007 5:59 PM
    To: use..ayenne.apache.org
    Subject: Re: AW: out of memory

    Peter is right in his assessment that the DataContext caches all of its
    loaded objects (especially since they get into a dirty state via
    'deleteObject'). To solve this problem, you may use a loop counter
    and commit and throw away the DataContext every X iterations (e.g.
    every 1000 or so). This will clear the memory.
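    The pattern Andrus describes might look roughly like this (a sketch only, against the Cayenne 1.2/2.0 API, reusing the it_q iterator and FeedsAll class from Marco's code below):

```java
// Commit and replace the DataContext every BATCH_SIZE deletes, so its
// object cache never holds more than BATCH_SIZE dirty objects at once.
final int BATCH_SIZE = 1000;
DataContext context = DataContext.createDataContext();
int count = 0;

while (it_q.hasNextRow()) {
    Map row = it_q.nextDataRow();
    FeedsAll obj = (FeedsAll) context.objectFromDataRow(
            FeedsAll.class, new DataRow(row), true);
    context.deleteObject(obj);

    if (++count % BATCH_SIZE == 0) {
        context.commitChanges();
        // throw the old context away; its cached objects become garbage
        context = DataContext.createDataContext();
    }
}
context.commitChanges(); // flush the final partial batch
```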

    On top of that, if you want all 800000 deletes to be processed as a
    single transaction, you may do the above, but also wrap that code in
    a manual transaction:

    http://cayenne.apache.org/doc/understanding-transactions.html
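    The manual transaction wrapper from the page above looks roughly like this (a sketch of the Cayenne 2.0 org.apache.cayenne.access.Transaction API, adapted rather than copied, so check it against the docs):

```java
// Run the whole batched delete/commit loop inside one manually
// managed transaction, so all 800000 deletes commit or roll back together.
Transaction tx = dataDomain.createTransaction();
Transaction.bindThreadTransaction(tx);
try {
    // ... the batched delete/commit loop goes here ...
    tx.commit();
}
catch (Exception ex) {
    tx.setRollbackOnly();
}
finally {
    Transaction.bindThreadTransaction(null);
    if (tx.getStatus() == Transaction.STATUS_MARKED_ROLLEDBACK) {
        try {
            tx.rollback();
        }
        catch (Exception rollbackEx) {
            // log and move on
        }
    }
}
```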

    Andrus

    On Mar 6, 2007, at 6:51 PM, Peter Schröder wrote:
    > I experienced the same problem. I think this was because of using
    >
    > objectFromDataRow()
    >
    > which puts the object into the data-row cache and will result in out-
    > of-memory errors.
    >
    > As a workaround, I used the raw data-row object from the query directly.
    >
    >
    > -----Original Message-----
    > From: marco turchi [mailto:marco.turch..mail.com]
    > Sent: Tuesday, March 6, 2007 5:48 PM
    > To: use..ayenne.apache.org
    > Subject: out of memory
    >
    > Dear experts,
    > I ran some code that reads 800,000 records from a database, but what I
    > get is the error "out of memory: heap....".
    > I use the following Cayenne commands:
    >
    > Expression exp_date = Expression.fromString(
    >     "dateRetrieval >= $start_d and dateRetrieval < $end_d and type = $type");
    > Map parameters = new HashMap();
    > parameters.put("end_d", "2006-03-09");
    > parameters.put("start_d", "2005-10-07");
    > parameters.put("type", "Job");
    > exp_date = exp_date.expWithParameters(parameters);
    > final SelectQuery feedsquery = new SelectQuery(FeedsAll.class, exp_date);
    > int count = 0;
    > ResultIterator it_q = null;
    > try {
    >     it_q = context.performIteratedQuery(feedsquery);
    >     while (it_q.hasNextRow()) {
    >         final Map row = it_q.nextDataRow();
    >         final FeedsAll obj = (FeedsAll) context.objectFromDataRow(
    >             FeedsAll.class, new DataRow(row), true);
    >         .....
    >         context.deleteObject(obj);
    >     }
    > }
    > catch (Exception e) {
    >     // System.out.println("Fatal Error: " + e);
    >     log.error("Fatal Error: ", e);
    >     log.info(e.getStackTrace());
    > }
    > finally {
    >     // close the iterator, or the underlying connection leaks
    >     if (it_q != null) {
    >         try { it_q.close(); } catch (Exception ignored) {}
    >     }
    > }
    >
    > What is wrong? I understood that by using performIteratedQuery I
    > could read a huge number of records without memory problems.
    > Can you help me?
    >
    > Thanks a lot
    > Marco
    >



    This archive was generated by hypermail 2.0.0 : Thu Mar 08 2007 - 05:09:57 EST