Re: Blobs in the DataContext

From: Tore Halset (halse..vv.ntnu.no)
Date: Tue May 25 2010 - 11:57:34 UTC


    Hello.

    I tried to implement support for streaming blobs back in 2006. Sorry, but it never got completed. I still think it would be a nice feature. If you want to work on this issue, you might want to take a look at streaming_blob_try2.zip from https://issues.apache.org/jira/browse/CAY-316

    Regards,
     - Tore.
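
    For context, the point of streaming blob support is to hand JDBC an
    InputStream rather than materializing the whole value as a byte[] on the
    heap. Below is a minimal plain-JDBC sketch of that idea, not Cayenne's
    internal API; the connection URL and the FILE_STORE table are made up:

        import java.io.File;
        import java.io.FileInputStream;
        import java.io.InputStream;
        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.PreparedStatement;

        public class BlobStreamSketch {
            public static void main(String[] args) throws Exception {
                File output = new File("output.bin"); // hypothetical file to store
                try (Connection con = DriverManager.getConnection(
                             "jdbc:postgresql://localhost/filedb", "user", "secret"); // made-up URL
                     PreparedStatement ps = con.prepareStatement(
                             "INSERT INTO FILE_STORE (NAME, DATA) VALUES (?, ?)");    // made-up table
                     InputStream in = new FileInputStream(output)) {
                    ps.setString(1, output.getName());
                    // The driver reads from the stream while writing the BLOB, so
                    // the file contents never have to sit in the Java heap at once.
                    ps.setBinaryStream(2, in, (int) output.length());
                    ps.executeUpdate();
                }
            }
        }

    The patch attached to CAY-316 was an attempt to expose the same idea
    through Cayenne itself.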

    On 21. mai 2010, at 23.27, MGargan..scholar.com wrote:

    > Hi,
    >
    > I'm using Cayenne to store large files in BLOBs as a process runs.
    > The first step of the process stores large input files (~600 MB), and
    > they end up in the DB just fine. We then run some tasks that produce
    > output files and store those (~500 MB) in the DB as well. The output
    > files never make it into the DB. In fact, the whole program appears to
    > just sit and wait, for what I have no idea, and as soon as another
    > thread is spawned it throws an OutOfMemoryError. I was trying to figure
    > out why the large input files persist fine while the large output files
    > cause a problem, and the only explanation I could come up with is that
    > the BLOBs are cached in the DataContext when they are created and never
    > cleared, so memory is eventually exhausted. Is this possible? Can
    > anyone think of anything else?
    >
    > Note: I'm also compressing the stream in memory as I add it to the
    > byte[], but still, that works for the input files. Also, each phase of
    > the process ends with a commit, so all the input files are committed
    > together and all the output files should be committed together as
    > well, but that second commit never happens.
    >
    > Thank you for any help you may be able to provide.
    > -Mike
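
    Regarding the caching question above: committed objects do stay
    registered in their DataContext, so each large byte[] (and its cached
    snapshot) remains reachable until the context itself goes away. A rough
    sketch of one way around that, assuming Cayenne 3.x and a hypothetical
    StoredFile entity with name and data attributes (not code from this
    thread):

        import org.apache.cayenne.access.DataContext;

        public class PhaseStoreSketch {

            // Store one file's bytes in a short-lived context so the committed
            // blob is not pinned for the lifetime of a long-running process.
            static void storeFile(String name, byte[] bytes) {
                DataContext context = DataContext.createDataContext();
                StoredFile file = context.newObject(StoredFile.class); // hypothetical entity
                file.setName(name);
                file.setData(bytes);
                context.commitChanges();

                // Alternative with one long-lived context: drop the object after
                // the commit so its snapshot and byte[] can be garbage collected.
                // context.unregisterObjects(java.util.Collections.singletonList(file));
            }
        }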

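    As for the note about compressing in memory: gzipping into the byte[]
    still builds the entire compressed value on the heap before Cayenne ever
    sees it, so it shrinks the per-file footprint but does not remove it. A
    sketch of that in-memory approach, with illustrative names only:

        import java.io.ByteArrayOutputStream;
        import java.io.IOException;
        import java.io.InputStream;
        import java.util.zip.GZIPOutputStream;

        public class CompressSketch {

            // GZIP-compress an input stream into a byte[] held entirely in
            // memory. The result (plus the DataContext snapshot of it) stays
            // on the heap until the object is unregistered or the context is
            // released.
            static byte[] gzipToBytes(InputStream in) throws IOException {
                ByteArrayOutputStream buffer = new ByteArrayOutputStream();
                GZIPOutputStream gzip = new GZIPOutputStream(buffer);
                byte[] chunk = new byte[8192];
                int read;
                while ((read = in.read(chunk)) != -1) {
                    gzip.write(chunk, 0, read);
                }
                gzip.finish();
                return buffer.toByteArray();
            }
        }

    Spooling the compressed data to a temporary file and streaming it from
    there would keep heap usage roughly constant, which is what end-to-end
    streaming blob support would make possible.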

