Re: Re: Re: updating large number of data

From: Peter Schröder (Peter.Schroede..reenet-ag.de)
Date: Wed Dec 06 2006 - 04:04:07 EST

    I might do something like this:

    - load the data into a temp table
    - compare it with the actual data
    - update the changed rows through Cayenne

    This would keep the original table running during the data load, and I won't have to invalidate any caches.
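
    [A minimal sketch of that approach, using the Cayenne 1.2/2.0-era API and
    assuming a hypothetical User entity mapped to a user table, a user_tmp
    staging table, and a single name column to diff (all of these names are
    placeholders, not anything from the thread):

        import java.util.Iterator;
        import java.util.List;
        import java.util.Map;

        import org.apache.cayenne.DataObjectUtils;
        import org.apache.cayenne.access.DataContext;
        import org.apache.cayenne.query.SQLTemplate;

        public class TempTableRefresh {

            public static void refresh(DataContext context) {
                // 1. Load the hourly CSV into the staging table; the live table stays online.
                context.performNonSelectingQuery(new SQLTemplate(User.class,
                        "LOAD DATA INFILE '/tmp/users.csv' INTO TABLE user_tmp", false));

                // 2. Fetch only the rows that actually differ, as raw data rows (Maps).
                SQLTemplate diff = new SQLTemplate(User.class,
                        "SELECT t.id, t.name FROM user_tmp t"
                                + " JOIN user u ON u.id = t.id WHERE u.name <> t.name", true);
                diff.setFetchingDataRows(true);
                List changedRows = context.performQuery(diff);

                // 3. Apply the changes through Cayenne so its object cache stays consistent.
                for (Iterator it = changedRows.iterator(); it.hasNext();) {
                    Map row = (Map) it.next();
                    User user = (User) DataObjectUtils.objectForPK(
                            context, User.class, ((Number) row.get("id")).intValue());
                    user.setName((String) row.get("name"));
                }
                context.commitChanges();
            }
        }

    This only covers changed rows; new and deleted rows would need two more
    diff queries along the same lines.]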

    -----Original Message-----
    From: Michael Gentry [mailto:blacknex..mail.com]
    Sent: Tuesday, December 5, 2006 18:23
    To: cayenne-use..ncubator.apache.org
    Subject: Re: Re: Re: updating large number of data

    Yes, invalidateObjects() should do it. The trickier part is knowing
    when to do it and finding everything to invalidate. A good portion of
    that will depend on your application. The basic steps, though, will
    be:

    * Get notified of the table update
    * Get a list of your DataContext objects (you might have to track
    those yourself)
    * Iterate over each DataContext, and for each one iterate over the objects
    inside of it (dataContext.getObjectStore().getObjectIterator()),
    build up a collection of the objects from the refreshed table, then call
    invalidateObjects() on that collection (see the sketch below)
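
    [A sketch of that loop, assuming the application keeps its active
    DataContexts in a List it maintains itself (Cayenne won't track them for
    you), and that the refreshed table maps to a hypothetical User entity:

        import java.util.ArrayList;
        import java.util.Collection;
        import java.util.Iterator;
        import java.util.List;

        import org.apache.cayenne.DataObject;
        import org.apache.cayenne.access.DataContext;

        public class CacheInvalidator {

            // Assumption: the application tracks its active DataContexts itself.
            public static void invalidateUsers(List activeContexts) {
                for (Iterator ctxIt = activeContexts.iterator(); ctxIt.hasNext();) {
                    DataContext context = (DataContext) ctxIt.next();

                    // Collect every registered object belonging to the refreshed table.
                    Collection stale = new ArrayList();
                    Iterator objects = context.getObjectStore().getObjectIterator();
                    while (objects.hasNext()) {
                        DataObject object = (DataObject) objects.next();
                        if ("User".equals(object.getObjectId().getEntityName())) {
                            stale.add(object);
                        }
                    }

                    // Drop the cached snapshots; objects refetch lazily on next access.
                    context.invalidateObjects(stale);
                }
            }
        }]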

    This might be a lot of overkill, though. It depends on your application.
    For example, can your application be shut down while the refresh is
    happening? How much data is in those tables? It sounds like they are
    read-only, so you might just be able to fetch everything again.
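
    [If a full refetch is good enough, a sketch of that (again with a
    hypothetical User entity; setRefreshingObjects(true) is the 1.2/2.0-era
    way to make Cayenne overwrite its cached snapshots with the fresh rows
    instead of reusing them):

        import java.util.List;

        import org.apache.cayenne.access.DataContext;
        import org.apache.cayenne.query.SelectQuery;

        public class FullRefetch {

            public static List refetchUsers(DataContext context) {
                SelectQuery query = new SelectQuery(User.class);
                // Overwrite cached snapshots with the freshly loaded rows.
                query.setRefreshingObjects(true);
                return context.performQuery(query);
            }
        }]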

    /dev/mrg

    On 12/5/06, Peter Schröder <Peter.Schroede..reenet-ag.de> wrote:
    > Yes, we have an auto-increment PK on that table. How do I invalidate the caches? I know there is an invalidateObjects() method, but is this the way to do it?
    >
    > -----Original Message-----
    > From: Michael Gentry [mailto:blacknex..mail.com]
    > Sent: Tuesday, December 5, 2006 16:30
    > To: cayenne-use..ncubator.apache.org
    > Subject: Re: Re: updating large number of data
    >
    > You know, sometimes simple and fast is a good way to do things. Do
    > you have an auto-increment PK in that table? That would be helpful. As for
    > Cayenne, can you flush (invalidate) any active DataContexts (at
    > least the objects for that table) when the load occurs?
    >
    > /dev/mrg
    >
    >
    > On 12/5/06, Peter Schröder <Peter.Schroede..reenet-ag.de> wrote:
    > > We are deleting all rows with TRUNCATE TABLE first, then loading the CSV with LOAD DATA INFILE.
    > >
    > > I would prefer not to use this method, but it is simple and fast.
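
    [For reference, that raw reload boils down to two statements outside of
    Cayenne; a minimal JDBC sketch, where the connection URL, table name, and
    CSV path are placeholders:

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.Statement;

        public class RawReload {

            public static void reload() throws Exception {
                Connection con = DriverManager.getConnection(
                        "jdbc:mysql://localhost/mydb", "user", "secret");
                try {
                    Statement st = con.createStatement();
                    st.executeUpdate("TRUNCATE TABLE user");
                    // MySQL bulk load; it bypasses the ORM entirely, which is
                    // exactly why any Cayenne caches go stale.
                    st.executeUpdate("LOAD DATA INFILE '/tmp/users.csv' INTO TABLE user"
                            + " FIELDS TERMINATED BY ',' LINES TERMINATED BY '\\n'");
                    st.close();
                }
                finally {
                    con.close();
                }
            }
        }]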
    > >
    > > -----Original Message-----
    > > From: Michael Gentry [mailto:blacknex..mail.com]
    > > Sent: Tuesday, December 5, 2006 14:38
    > > To: cayenne-use..ncubator.apache.org
    > > Subject: Re: updating large number of data
    > >
    > > Are you deleting all of the original data and then doing inserts or
    > > are you doing updates?
    > >
    > > Thanks,
    > >
    > > /dev/mrg
    > >
    > >
    > > On 12/5/06, Peter Schröder <Peter.Schroede..reenet-ag.de> wrote:
    > > > Hi,
    > > >
    > > > We get a CSV file with a large amount of user data every hour, and we want to replace the existing data in our database with it. Is there a best practice for doing something like this?
    > > > Currently we are doing it with PHP, using MySQL's LOAD DATA INFILE on the CSV file, but I think that doing this with Cayenne would leave the context in a bad state.
    > > >
    > > > Have a nice day,
    > > > Peter
    > > >
    > >
    >


