Hey guys, I'm having a weird issue with something that should be pretty straightforward. I have an invoice, which in turn has an entity collection of spider summaries. Our new billing engine "asserts" summaries onto invoices, so if multiple billing schemes apply, one billing class won't know that others have already added a summary to an invoice. As such, I've created a simple method named "AddOrUpdateSpiderSummary". Below is an example of my test code, as well as my method. When the third summary is added, index 0 is returned. I have two elements, at index 0 and index 1. However, when RemoveAt is called, it removes both index 0 and index 1 even though I only pass it index 0. This seems incorrect, as 0 and 1 point to two different instances of an object in memory. Any ideas why this is happening?
thanks, Todd
/// <summary> Test execution </summary>
public void TestTotalSummation()
|
|
Hi Todd, Unfortunately our horrible forum software has mangled your code to the point where I'm not able to follow it. Would you be able to repost the code? You can post it as an attachment via the Options tab, which should get around whatever gremlins have decided to infest Community Server today... |
|
|
Let's try this again. Sorry for the junk code, I couldn't edit my post. I've attached my code with comments. Check out the Invoice class AddOrUpdateSummary method. I've commented a few approaches I've tried which don't seem to work. It seems to be an issue with the EntityState being new, so it doesn't have a Unique Guid from the DB. When I remove any element with EntityState.New, it seems all elements in that state are removed from the collection, not the specific index I've referenced.
thanks, Todd |
|
|
Thanks for re-posting, Todd, and sorry for the inconvenience. We have a known bug with adding New entities to an EntityCollection and then removing them again before saving -- it sounds like this is the same issue. The workaround, if you know you will be both adding and removing child items within the course of the unit of work, is to maintain a separate "net list" of items to add, on which you can call Add and Remove to your heart's content; then, when you have finished adding and removing, add all the survivors to the real collection. We realise this is somewhat inconvenient, but it's a bit unusual to create, add and remove the same item in a single unit of work, because it's effectively a no-op. |
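[Editor's note: a minimal sketch of the "net list" workaround described above. `NetList<T>` is an illustrative name, not a LightSpeed type.]

```csharp
using System.Collections.Generic;

// Scratch list that absorbs Add/Remove churn during a unit of work,
// so the real EntityCollection only ever sees the survivors.
public class NetList<T>
{
    private readonly List<T> _pending = new List<T>();

    public void Add(T item) { _pending.Add(item); }
    public bool Remove(T item) { return _pending.Remove(item); }

    // Called once all billing rules have run: push the survivors into
    // the real collection (e.g. invoice.SpiderSummaries).
    public void CommitTo(ICollection<T> target)
    {
        foreach (T item in _pending) target.Add(item);
        _pending.Clear();
    }
}
```

Items that are added and then removed within the unit of work never touch the entity collection at all, which sidesteps the New-entity removal bug.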
|
|
Hi Ivan, I agree it's a bit outside of normal persistence for something like a web app. I've encountered this sort of issue when using a rules-based system before. Each rule is encapsulated within a class, and the caller has no idea how a rule has modified the existing object graph: a rule might remove items, add multiple new ones, or even overwrite a previously added value.
To me there seems to be a disconnect in the remove logic. From my layman's view as an LS user, an entity in a collection really has two different identities. If the entity state is New, the entity should be identified by its memory address, or by business logic invoked via the Equals method. If the entity is in any other state, we know it's been assigned a generated PK by the database, and that can be used for removing and adding entities.
Will this be addressed in LS3? I completely agree it's a no-op and a bit of an edge case in the context of persistence. However, I'd prefer not to have to make my business logic persistence-aware. I simply want to pass it an object graph, let it perform updates and deletes on that graph, then persist whatever results from the operation.
Thanks for the speedy reply as always! Todd
|
|
|
It's not really that there are two use cases: entities in a unit of work always have an ID, even if it's only a temporary one, and they also have a .NET object identity. So we really do always have reliable ways of referencing an entity, even one in the New state. In fact, the bug arises because of the way we automatically add and remove entities from a unit of work. Here's how the internals work. Each entity knows its own unit of work, and tries to propagate that to associated entities so that the entire graph remains in the same unit of work. (This is why you can add children to a parent without having to explicitly add them to the parent's UOW.) But internally, the way we represent removing a new entity is to set its unit of work to null. And this is where the bug comes in: the entity sees that its unit of work is being set, and tries to propagate the new UOW to associated entities. But the new UOW is null, which results in other new entities taking themselves out of the unit of work. I can't promise that this will be fixed in LS3: we've discussed ways of fixing this before, but haven't yet found a completely satisfactory solution (because null UOW propagation is sometimes desirable). I know it has bitten several customers, though, so I think we will have to tackle it sooner or later; but again, I can't promise that this will happen in time for LS3. Sorry. |
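[Editor's note: the propagation behaviour described above can be modelled with a toy sketch. This is illustrative only, not the LightSpeed source; all names here are made up.]

```csharp
using System;
using System.Collections.Generic;

public class ToyUnitOfWork { }

// Toy entity that propagates its unit of work across the object graph,
// reproducing the null-propagation bug described in the post above.
public class ToyEntity
{
    private readonly List<ToyEntity> _associates = new List<ToyEntity>();
    public ToyUnitOfWork UnitOfWork { get; private set; }

    public void Associate(ToyEntity other) { _associates.Add(other); }

    public void SetUnitOfWork(ToyUnitOfWork uow)
    {
        if (UnitOfWork == uow) return; // guard against cycles
        UnitOfWork = uow;
        // Keeping the whole graph in one UOW is normally what you want --
        // but when uow is null (a New entity being removed), the null
        // spreads to associated entities, detaching them as well.
        foreach (ToyEntity associate in _associates)
            associate.SetUnitOfWork(uow);
    }
}
```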
|
|
Cool. That makes sense, thanks for the explanation. In light of this, I'll need to perform a deep copy/merge of my object graph. Given that my collections could be quite large, I don't want to iterate over them to find my entry. In looking at the source, I see that your Set used within the EntityCollection is backed by a Dictionary for O(1) access. This is exactly the performance I need, BUT I need to search this dictionary based on a business key, not the database guid when performing a merge. I've run into issues with overriding Equals on my entities before. What approach would you recommend for keeping EntityCollections in a dictionary, but keyed by something besides the database PK so I can quickly locate them? |
|
|
Use a HashSet<T> or Dictionary<TKey, TValue>, but implement IEqualityComparer<T> to compare on business keys, and pass an instance of your custom equality comparer to the HashSet or Dictionary constructor. This will give you O(1) performance with lookup based on the business key rather than the PK. |
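[Editor's note: for example, a comparer keyed on a hypothetical BillingCode business key. The entity shape and property names here are made up for illustration.]

```csharp
using System;
using System.Collections.Generic;

// Hypothetical entity: Id is the database PK, BillingCode the business key.
public class SummaryItem
{
    public Guid Id { get; set; }
    public string BillingCode { get; set; }
}

// Compares and hashes on the business key instead of the PK.
public class BillingCodeComparer : IEqualityComparer<SummaryItem>
{
    public bool Equals(SummaryItem x, SummaryItem y)
    {
        if (ReferenceEquals(x, y)) return true;
        if (x == null || y == null) return false;
        return x.BillingCode == y.BillingCode;
    }

    public int GetHashCode(SummaryItem obj)
    {
        return obj.BillingCode == null ? 0 : obj.BillingCode.GetHashCode();
    }
}
```

Passing `new BillingCodeComparer()` to the HashSet or Dictionary constructor makes lookups hash on the business key, so the entity's own Equals and GetHashCode overrides are left alone.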
|
|
Sorry, my question wasn't the clearest. A better question would have been the following.
Can I pass my own equality comparer to an entity collection without needing to maintain a duplicate HashSet or Dictionary?
|
|
|
Sorry for misunderstanding. No, you can't pass your own equality comparer to an entity collection. An EntityCollection is actually a BindingList, not a Set; the inner Set merely performs some additional bookkeeping, and requires reference equality semantics. |
|
|
Hey Ivan, Yet another question :) After some prototyping, I've decided the best course of action is to create a subclass of EntityCollection named BusinessEntityCollection. This collection keeps a dictionary of entities which are hashed and compared via the IEqualityComparer instance passed to the constructor. It also keeps a separate List of entities that are in the EntityState.New state, and does not add them to the underlying EntityCollection Set until save is called. This lets me keep objects that could potentially result in a no-op via add and remove separate from the persisted set. This is working well: it allows me to treat my collections of entities in my Model as simple sets, without worrying about whether they've been persisted or not. However, I've run into one final snag. I can't find any event that is equivalent to an "on save" event. Right before the data is serialized to be persisted, I want to move everything from my transient List of new entities into the Set in the entity collection. Do you know of any way I can encapsulate this logic within my BusinessEntityCollection class (I want to avoid tying the model's OnSave event to the collection in my code)? Note that I'm not adding this to your source, but rather in a separate dll, so I don't have access to the internal functions in the code.
Thanks, Todd |
|
|
There isn't an OnSaving for an EntityCollection, because entity collections don't participate in saving: only their constituent entities do. There are a couple of ways around this, neither very pretty:
1. In your custom collection class, override InsertItem to subscribe to the Saving event on the item being added. When the collection receives this Saving event, it would copy itself into the real collection. Since you may receive Saving from multiple items, you will need to track "have I already copied myself". Some care and testing is required here because, depending on the save order, you may receive Saving too late for the copied collection to be picked up. (You should be okay, because the entity raising Saving is the entity that needs to be updated with the right FK, so it hasn't saved yet; but test!) Also, this depends on New entities having been added to the unit of work (so that they get on the save list), although not to the parent's collection.
2. Override UnitOfWork.SaveChanges(bool) so that it forces all collections to copy before calling base.SaveChanges (which actually does the save). However, it's not obvious to me how to do this in a way that is clean and doesn't pollute the derived unit of work class with knowledge of the domain model; I guess entities with child collections could register those child collections in AfterLoad or something, but it's definitely not pretty.
Hope this helps, or at least sparks some better ideas...! |
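[Editor's note: option 1 above can be sketched with a toy event model. `NotifyingEntity`, its `Saving` event, and `CopyOnSaveCollection` are stand-ins invented for illustration, not the actual LightSpeed API.]

```csharp
using System;
using System.Collections.Generic;
using System.Collections.ObjectModel;

// Stand-in for an entity that raises a Saving event just before persisting.
public class NotifyingEntity
{
    public event EventHandler Saving;
    public void RaiseSaving()
    {
        var handler = Saving;
        if (handler != null) handler(this, EventArgs.Empty);
    }
}

// Copies its items into the target collection the first time any member
// raises Saving (the "have I already copied myself" flag guards repeats).
public class CopyOnSaveCollection : Collection<NotifyingEntity>
{
    private readonly IList<NotifyingEntity> _target;
    private bool _copied;

    public CopyOnSaveCollection(IList<NotifyingEntity> target) { _target = target; }

    protected override void InsertItem(int index, NotifyingEntity item)
    {
        item.Saving += OnItemSaving; // subscribe as items are added
        base.InsertItem(index, item);
    }

    private void OnItemSaving(object sender, EventArgs e)
    {
        if (_copied) return;
        _copied = true;
        foreach (var item in this) _target.Add(item);
    }
}
```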
|
|
Aaugh! Todd, I hate to say this after going through all these workarounds, but I happened to be reviewing the changelogs for some other purpose and it looks like we committed a fix for this issue (or at least a closely related one) a few weeks ago. If you are on a nightly build prior to 10 September 2009, you might want to upgrade to the latest nightly and see if that fixes the problem. I really apologise for overlooking this and putting you through all this grief. |
|
|
Hi Ivan, It's not the end of the world; I've learned a lot about LS internals. Is the latest nightly stable enough for production use? I'm sure you know we're using this in a critical real-time application, so I have to be 100% sure it's as reliable as the 2.2 release.
Thanks, Todd |
|
|
Hi Ivan I just tried the new dll, and it doesn't appear to be fixed. If I call "Remove(Entity e)" with any of the 3 different new instances that have been added, I'm still getting all 3 removed. |
|
|
The current nightlies are stable enough for production use as far as we know. One of the things that doesn't happen with nightlies is load or performance testing, so it is possible we've introduced a regression in those areas, and I'd advise performing your own load tests before deploying it -- but since your application is "critical real time," I'm guessing you already do this as part of your release process, so hopefully this wouldn't be a big deal. In terms of stability, we believe that the nightly is as stable as RTM, probably more reliable in core areas because it's had more shaking down. Obviously, new features and recent bug fixes represent risk areas, but the bug fix for "removing a new entity unregisters siblings" has been in the nightlies for more than six weeks and appears to be stable, and even the most recent changes have been out there for more than three weeks with no known issues (though admittedly I don't have figures for how many people are using the most recent builds and exercising those latest fixes; I do know that older nightlies *are* in use on highly trafficked sites). And needless to say, the nightly passes all the tests that 2.2 RTM did (and a fair number more!). In summary, I would have high confidence in the stability of the nightly, but I would advise performance testing in any high-load performance-critical scenario. I hope that's enough info for you to judge the risk; if you need more details let me know. |
|
|
Okay, we'll investigate -- thanks for letting us know, and I guess it's back to the workarounds in the meantime then... |
|
|
Hi Ivan, I'm working on creating my own build of LS 2.2. Rather than modify the existing codebase and risk breaking something, I'm adding a new class, similar in behaviour and structure to ThroughAssociation. I'd like to run all the existing unit/integration tests and create some new ones for my new class. Do you have any documentation on getting the test environment set up and running?
Thanks, Todd |
|
|
Hi Todd, We don't have documentation on setting up a testing environment for the LightSpeed source as it's provided for reference only and not for customisation. Aside from this, we would be unable to provide support to customers who have altered the source. I'd reconsider hacking around to make your scenario work and suggest that if you could provide us with a simple repro of the issue we'll fix it in a nightly build. We've resolved a similar issue some time back relating to Remove() so it's surprising that it's broken for RemoveAt(). If you could create a basic repro and fire it off to me with the binaries stripped out we'll take a look. This way you don't have to worry about losing support, won't have issues upgrading to future versions, won't break the license agreement and should be singing and dancing in no time :-) I hope that helps, John-Daniel Trask |
|
|
Hey JD, Check out the attached zip on the second post. It has both my Entities as well as their corresponding tests for adding and removing from the set.
thanks, Todd |
|
|
Hi! We have at least three different problems right now with LightSpeed, and we will try to create test cases for them. Two of them are hard to reproduce since they only occur when running on a 64-bit server, and the second problem arises occasionally for reasons as yet unknown. With the second issue we strongly suspect the O/R mapper, since we log the entity and what should be persisted, but one of its child objects is sometimes not stored to the database, even though inspecting the entity suggests it should be. The third one is easily reproducible and is related to inserting into and deleting from an EntityCollection; we will try to create a test case for it. The latest nightly build doesn't fix the problem. So, I was a bit surprised to hear there was a known bug in the O/R mapper related to this. Do you have a list of known bugs? That could save us a great deal of work. Best regards, Björn Andersson, Admeta AB
|
|
|
We do maintain a list of known bugs, but it's not in a readily shareable state. We'd have some work to do to make it available (and would probably need to consider some commercial issues e.g. who we would make it available to). It's also not clear to me whether this would add significant benefit over forums search (if we improved forums search so that it worked properly that is). This is something we could look at in the new year, but not before then, and I can't promise it's something we'll do even then. Anyway, I've, er, logged a bug for this. Re your second issue, one possible cause for errors of this kind is when an Attach "overwrites" an entry in the unit of work. I don't know if that's relevant to your architecture, but I thought I'd mention it as a possible avenue of investigation. See http://www.mindscape.co.nz/forums/Thread.aspx?ThreadID=2596 for info if you think this might be a possibility. |
|