This thread looks to be a little on the old side and therefore may no longer be relevant. Please see if there is a newer thread on the subject and ensure you're using the most recent build of any software if your question regards a particular product.
This thread has been locked and is no longer accepting new posts. If you have a question regarding this topic, please email us at support@mindscape.co.nz
|
Hi, I'm trying to enforce a unique value by creating a unique index in the database. Adding a duplicate value raises a SqlException, which I can catch to ensure data integrity. I'm not sure what the standard practice is (I'm new to LightSpeed), and I'm running into issues with the in-memory representation not matching the database state. In essence, here is the test:

Order order = new Order();
order.AddItem(item1);
OrderItem item2 = new OrderItem();
using (var scope = new TransactionScope())
Assert.AreEqual(1, order.GetItems().Length);

and here is the AddItem method:

public void AddItem(OrderItem item)
{
    unitOfWork.Attach(this);
    _orderItems.Add(item);
    unitOfWork.SaveChanges();
}

private readonly EntityCollection<OrderItem> _orderItems = new EntityCollection<OrderItem>();

SaveChanges() fires the exception when the query is run, but 'item' still exists in _orderItems. How do I enforce its removal? Regards, Peter. |
|
|
Hi Peter, Take a look at the ValidateUnique attribute, which lets you do this at the model level, e.g. [ValidateUnique]. Cheers, Andrew. |
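A minimal sketch of what Andrew is suggesting, assuming a hand-written partial entity class (the OrderItem/Sku names are illustrative, and the Get/Set backing-field pattern follows LightSpeed's usual conventions - check the generated code for your own model):

```csharp
// Illustrative only: LightSpeed's [ValidateUnique] on a backing field
// makes validation check, before the flush, that no other OrderItem
// row has the same value for this field.
public partial class OrderItem : Entity<int>
{
    [ValidateUnique]
    private string _sku;

    public string Sku
    {
        get { return Get(ref _sku, "Sku"); }
        set { Set(ref _sku, value, "Sku"); }
    }
}
```

With this in place, SaveChanges() fails validation model-side rather than surfacing a SqlException from the unique index.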
|
|
Hi Andrew, I've looked at that attribute. It appears to do a 'select count(*)' to check for uniqueness. I would rather rely on database validation using optimistic locking and the unique index, which will also be quicker and guarantees correctness under concurrent usage. I've attached an example project in my first post. Regards, Peter. |
|
|
Hi Peter, Usually ValidateUnique will actually be more performant, because you circumvent the flush process entirely and the count query is very fast. In your example, however, ValidateUnique isn't much help because both OrderItems are new. That said, this scenario is actually easier to validate model-side, because we have all the information we need. Take a look at creating a custom validator or overriding OnValidate. Cheers, Andrew. |
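Since both new items live in the same unit of work, a custom check can catch the duplicate before any SQL runs. A rough sketch of overriding OnValidate (the Sku property, the Order back-reference, and the exact Errors API are assumptions - consult the LightSpeed documentation for the precise signatures):

```csharp
public partial class OrderItem
{
    protected override void OnValidate()
    {
        // Illustrative: compare against sibling items already in the
        // in-memory collection, so two new duplicates are caught
        // before the flush ever reaches the database.
        if (Order != null &&
            Order.OrderItems.Any(i => i != this && i.Sku == Sku))
        {
            Errors.AddError("Sku", "Sku must be unique within the order.");
        }
    }
}
```

This covers the two-new-entities case that a database round-trip check cannot see, because neither row has been inserted yet.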
|
|
Thanks Andrew. My example did have two 'new' entities, and this is probably not how I would actually use it; there would only be one added at a time. As a more generic question, then: model-side validation can't guarantee that I won't get duplicates unless I serialize access to the model using locking. This isn't possible, for instance, with multiple instances of the model distributed across multiple computers. If two models 'simultaneously' add the same value, they'll both do a select count that returns '0' from the view of their own database transaction. Then they'll both do an insert, with one of them failing due to the database constraint. What should I do if I encounter an exception due to a database-level constraint violation? It would appear that the model isn't rolled back to the same state as the database. Regards, Peter. |
|
|
[quote user="Peter"]What should I do if I encounter an exception due to a database level constraint violation? It would appear that the model isn't rolled back to the same state as the database.[/quote]
Correct. How to handle this is really an application-level concern. In order to re-sync an object with the database you can just re-query it - bear in mind, though, that by rolling back you will lose any unsaved state in that object. Also be wary of long-lived objects and overuse of Attach. For most scenarios the pattern is something like:
1. Begin a unit of work.
2. Query or attach the entities you need and make your changes.
3. SaveChanges.
4. Dispose the unit of work.
So a simple approach to handling your scenario would be to:
- Catch any database exceptions to provide a graceful UI experience.
- Re-query the affected entity in a fresh unit of work to re-sync it with the database.
Cheers, Andrew. |
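Putting this advice together, a minimal sketch of the catch-and-re-query approach for Peter's AddItem scenario (the context/FindById calls follow LightSpeed's usual API, but treat the exact names and the exception type that SaveChanges surfaces as assumptions to verify against your version):

```csharp
try
{
    unitOfWork.Attach(order);
    order.OrderItems.Add(item);
    unitOfWork.SaveChanges();   // unique index violation surfaces here
}
catch (SqlException)
{
    // The in-memory model is now ahead of the database: 'item' is
    // still in the collection even though the insert was rejected.
    // Discard the stale unit of work and re-query to get back in sync,
    // accepting that any other unsaved changes are lost.
    unitOfWork.Dispose();
    unitOfWork = context.CreateUnitOfWork();          // context: your LightSpeedContext
    order = unitOfWork.FindById<Order>(order.Id);     // assumed query API
}
```

The key design point is that the unit of work is treated as disposable: rather than surgically undoing the failed change, you throw the whole unit of work away and reload from the database, which is by definition the authoritative state.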
|
|
Ok, thanks very much. I'll give this a try. Regards, Peter. |
|