Hi,
I'm using Cayenne to store large files in BLOBs as a process runs.
The first step of the process stores large input files (~600 MB), and
those end up in the DB just fine. We then run some tasks that produce
output files, and then store the large output files (~500 MB) to the DB.
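
To be concrete, each storage phase looks roughly like this (simplified;
MyFile stands in for my generated DataObject subclass, which maps a
byte[] "data" property to the BLOB column, and compressToBytes() is my
compression helper, sketched in the note further down):

    import java.io.File;
    import java.io.IOException;
    import java.util.List;
    import org.apache.cayenne.access.DataContext;

    // Rough sketch of one storage phase. MyFile and compressToBytes()
    // are placeholders for my actual entity class and compression helper.
    void storeFiles(DataContext context, List<File> files) throws IOException {
        for (File f : files) {
            MyFile record = context.newObject(MyFile.class);
            record.setName(f.getName());
            record.setData(compressToBytes(f)); // whole compressed file held in memory
        }
        context.commitChanges(); // the whole phase is committed together
    }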
The output files are not making it into the DB. In fact, it appears
that the whole program is just sitting and waiting, for what I have no
idea, and if you then try to spawn another thread in the program it
throws an OutOfMemoryError. I was trying to figure out why the larger
input files get persisted fine while the large output files cause a
problem. The only thing I could think of is that when the BLOBs are
created they are cached in the DataContext and never cleared,
eventually exhausting memory. Is this possible? Can anyone think of
anything else?
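
If the cached-BLOBs theory is right, would explicitly dropping the
committed objects from the context after each phase be the fix?
Something like this, if I'm reading the DataContext API right ("stored"
would be the list of records created in that phase):

    // After each phase's commit, drop the now-persistent records from
    // the DataContext so their byte[] contents can be garbage collected.
    // (Alternatively, a fresh DataContext per phase should have the
    // same effect once the old one goes out of scope.)
    context.commitChanges();
    context.unregisterObjects(stored);
    stored.clear();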
Note: I'm also compressing the stream in memory as I'm adding it to
the byte[], but still... it works for the input files. Also, each
phase of the process is followed by a commit, so all the input files
are committed together and all the output files should be committed
together as well, but that second commit never happens.
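
For reference, the compression helper is roughly this (I'm using GZIP
here for illustration; the point is that the compressed output, the
stream buffers, and the resulting byte[] all sit on the heap at once):

    import java.io.BufferedInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.File;
    import java.io.FileInputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.util.zip.GZIPOutputStream;

    // Compress a file into an in-memory byte[]. The entire compressed
    // payload is buffered on the heap before it is handed to Cayenne.
    static byte[] compressToBytes(File f) throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        GZIPOutputStream gzip = new GZIPOutputStream(bytes);
        InputStream in = new BufferedInputStream(new FileInputStream(f));
        try {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                gzip.write(buf, 0, n);
            }
        } finally {
            in.close();
            gzip.close(); // close() also finishes the GZIP stream
        }
        return bytes.toByteArray();
    }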
Thank you for any help you may be able to provide.
-Mike