AW: updating large number of data

From: Peter Schröder (Peter.Schroede..reenet-ag.de)
Date: Tue Dec 05 2006 - 09:53:39 EST

  • Next message: Michael Gentry: "Re: AW: updating large number of data"

    We are deleting all rows with TRUNCATE TABLE first, then loading the CSV with LOAD DATA INFILE.

    I would prefer not to use this method, but it is simple and fast.
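    For reference, the full-refresh approach described above can be sketched in plain SQL. The table name, file path, and CSV format options below are hypothetical examples, not taken from the original setup:

    ```sql
    -- Full refresh: wipe the table, then bulk-load the hourly CSV.
    -- Table name `users` and the file path are illustrative only.
    TRUNCATE TABLE users;

    LOAD DATA LOCAL INFILE '/tmp/users.csv'
    INTO TABLE users
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES;  -- skip a header row, if the CSV has one
    ```

    Note that in MySQL, TRUNCATE TABLE causes an implicit commit, so other readers may briefly see an empty table between the truncate and the load. Loading into a staging table and then swapping it in with RENAME TABLE avoids that window.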

    -----Original Message-----
    From: Michael Gentry [mailto:blacknex..mail.com]
    Sent: Tuesday, December 5, 2006 14:38
    To: cayenne-use..ncubator.apache.org
    Subject: Re: updating large number of data

    Are you deleting all of the original data and then doing inserts or
    are you doing updates?

    Thanks,

    /dev/mrg

    On 12/5/06, Peter Schröder <Peter.Schroede..reenet-ag.de> wrote:
    > hi,
    >
    > we get a CSV file with a large number of user data every hour, and we want to replace the existing data in our database with it. Is there a best practice for something like this?
    > Currently we are doing that with PHP, using MySQL's LOAD DATA INFILE with the CSV file. But I think that doing this with Cayenne would leave the context in a bad state.
    >
    > have a nice day,
    > peter
    >



    This archive was generated by hypermail 2.0.0 : Tue Dec 05 2006 - 09:55:14 EST