This thread looks to be a little on the old side and therefore may no longer be relevant. Please check whether there is a newer thread on the subject, and make sure you're using the most recent build of any software if your question concerns a particular product.
This thread has been locked and is no longer accepting new posts. If you have a question regarding this topic, please email us at support@mindscape.co.nz
|
Hi. I have a situation where I create a new unit of work, then in a foreach loop create a new entity and add it to the unit of work with unit.Add(entity). After the foreach loop exits, I call unit.SaveChanges() and unit.Dispose(). The problem is that I need to insert a fairly large number of records (2000 or so). What happens is that my memory usage skyrockets after a while and the program crashes with an OutOfMemoryException.

I came to the conclusion this is because I create a new entity every iteration, and since Entity does not implement IDisposable it just doesn't get released. When I tried creating only a single entity object outside the foreach loop, then setting its properties and adding it to the unit of work, I got an exception saying that the entity had already been added. I guess it receives the same ID as the one before (this happens after only two iterations).

I am using an Oracle database with IdentityMethod = Sequence. I know I have everything set up properly, because I can insert records without any problem when I have a smaller dataset to operate on. I am sure, or at least hope, there is a simple solution, because 2000 records is really not that much. Any help or suggestions on how to tweak the code will be greatly appreciated. Thanks
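For reference, the pattern described above looks roughly like this. This is only a sketch of what the post describes, not the poster's actual code; MyEntity, context, and records are illustrative placeholders:

```csharp
// Hypothetical reconstruction of the insert loop described in the post.
// context is an already-configured LightSpeedContext; MyEntity and
// records are placeholder names, not from the original post.
using (var unitOfWork = context.CreateUnitOfWork())
{
    foreach (var record in records)
    {
        var entity = new MyEntity();  // a fresh entity each iteration
        entity.Name = record.Name;
        unitOfWork.Add(entity);       // queued for insert, not yet saved
    }
    unitOfWork.SaveChanges();         // all ~2000 inserts flushed here
    // Dispose happens automatically at the end of the using block,
    // which corresponds to the explicit unit.Dispose() in the post.
}
```

Reusing a single entity instance instead of newing one up per iteration fails for the reason explained in the reply below: the unit of work tracks entity instances, so adding the same instance twice is rejected.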
|
|
|
2000 entities shouldn't be a problem. What's your UpdateBatchSize? Very large batch sizes can cause a problem like this (see http://www.mindscape.co.nz/blog/index.php/2008/09/18/saving-large-numbers-of-entities-in-lightspeed/). Also, are you using 2.2 RTM or a current nightly build? If you're using RTM, could you try installing the nightly and see if that improves matters?

If your UpdateBatchSize is the default and the nightly build doesn't help, could you create a small but complete console project that demonstrates the problem and post it for us to investigate? (You can attach a zip file via the Options tab, or mail to ivan @ the obvious domain name. Please remove all binaries first.) Please also include the CREATE TABLE definition for the entity. Thanks!

In the meantime, a possible workaround is to do a SaveChanges after every (say) 50 entities, but wrap the whole loop in a transaction to guarantee atomicity.

A couple of comments for additional background:

- You do not need to dispose entities, because they don't hold scarce resources.
- The reason you can't keep reusing the same entity is that entities have reference semantics. If you create two Persons named "Jim" and "Bob", that's different from creating a Person named "Jim" and then changing its name to "Bob". When you add the same entity the second time around the loop, LightSpeed knows it's the same entity instance, even though the properties have changed.
|
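The suggested workaround could be sketched roughly as follows, assuming a TransactionScope from System.Transactions to make the batched saves atomic (MyEntity, context, and records are placeholder names, and the batch size of 50 is just the example figure from the reply):

```csharp
using System.Transactions;

// Hedged sketch: flush pending inserts every 50 entities so the change
// set stays small, but wrap the whole run in one transaction so it
// still commits or rolls back as a unit.
using (var scope = new TransactionScope())
using (var unitOfWork = context.CreateUnitOfWork())
{
    int count = 0;
    foreach (var record in records)
    {
        var entity = new MyEntity();   // placeholder entity type
        entity.Name = record.Name;
        unitOfWork.Add(entity);

        if (++count % 50 == 0)
            unitOfWork.SaveChanges();  // flush this batch of inserts
    }
    unitOfWork.SaveChanges();          // save any remaining entities
    scope.Complete();                  // commit everything, or nothing
}
```

If `scope.Complete()` is never reached (for example, because an exception is thrown mid-loop), the transaction rolls back all the batches, preserving the atomicity of the original single-save approach.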
|
|
Thanks, Ivan, for your response. You've been a great help, again. It turns out the problem was not caused by LightSpeed at all, even though all of our diagnostics implied that it might have been. I guess we couldn't see the forest for the trees. Your comments about Jim and Bob got me thinking in another direction, which in turn led us to the real problem, again unrelated to LightSpeed. Thanks again for your prompt reply.
|
|