I'm using VistaDB and DevExpress. I need to delete multiple rows (thousands), and I see that SaveChanges takes forever. I noticed that adding many records also takes forever. How can I batch multiple operations? Thanks. Andrea
---
LightSpeed automatically batches multiple operations if the database engine supports it. You can configure how many statements go in each batch using LightSpeedContext.UpdateBatchSize (the default is 10), though in my tests this didn't seem to make an awful lot of difference. You can see the batching by setting LightSpeedContext.Logger = new TraceLogger() -- LightSpeed emits a "time taken" indicator after each batch.

From my testing I get the impression that the slowness is actually in the VistaDB engine. I have tried doing thousands of inserts using raw ADO.NET and the performance seems to be similar to that of LightSpeed. However, these were only preliminary tests, so I may be missing something. I did see more of a difference with deletes.

One issue to watch out for with deletes is that optimistic concurrency checking defeats batching -- that is, if optimistic concurrency checking (LockVersion) is turned on, deletes will not be batched. Could this be the problem?

If you are doing a bulk delete, the IUnitOfWork.Remove(Query) method allows you to blow away entities by specification (e.g. Entity.Attribute("LastActive") < archiveDate) rather than having to load them and delete them individually.
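For reference, a minimal sketch of the kind of setup being described might look like the following. The entity name (Widget), connection string and cut-off date are placeholders, and the exact provider settings and namespaces should be checked against your version of LightSpeed:

    using System;
    using Mindscape.LightSpeed;
    using Mindscape.LightSpeed.Logging;   // TraceLogger (namespace assumed)

    class BatchingSketch
    {
      static void Main()
      {
        // Turn on SQL logging so the "time taken" indicator appears after each
        // batch, and experiment with the batch size (the default is 10).
        LightSpeedContext context = new LightSpeedContext
        {
          ConnectionString = @"Data Source=C:\data\sample.vdb4",   // placeholder
          // a DataProvider setting for VistaDB would normally also be required here
          UpdateBatchSize = 50,
          Logger = new TraceLogger()
        };

        DateTime archiveDate = DateTime.Today.AddMonths(-6);       // placeholder cut-off

        using (IUnitOfWork uow = context.CreateUnitOfWork())
        {
          // Bulk delete by specification -- the entities are never loaded.
          uow.Remove(new Query(typeof(Widget), Entity.Attribute("LastActive") < archiveDate));
          uow.SaveChanges();
        }
      }
    }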
---
Thanks for your suggestions. I enabled the logging and discovered that when I delete 100 items, for example, 100 calls are made to the database. I would like to make just one call with the Ids of all the items to delete. What's the proper way to do that? And if I need to do bulk inserts, is there a way to speed those up? Thanks a lot. Andrea
---
Use IUnitOfWork.Remove(Query), with the QueryExpression set to an In expression, e.g.

    uow.Remove(new Query(typeof(Widget), Entity.Attribute("Id").In(ids)));

Please be sure to read the documentation on IUnitOfWork.Remove(Query), because this will not update entity status in the UOW.

There's no way to do bulk inserts other than issuing multiple INSERT statements. However, LightSpeed batches INSERT, UPDATE and DELETE statements to minimise round-trips to the database. You can customise batching using LightSpeedContext.UpdateBatchSize, though you should measure performance carefully because the behaviour may not be what you expect. Also, using the IdentityColumn identity method defeats batching: if you want to perform efficient bulk inserts, use KeyTable instead (and consider increasing the IdentityBlockSize).
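As a rough illustration of both points, assuming a configured LightSpeedContext named context (as in the earlier sketch); Widget, its Name property and the id values are placeholders, and the IdentityMethod/IdentityBlockSize property names should be checked against the documentation:

    // Bulk delete: one round of DELETE ... WHERE Id IN (...) rather than
    // loading each entity and removing it individually.
    using (IUnitOfWork uow = context.CreateUnitOfWork())
    {
      object[] ids = { 1L, 2L, 3L };   // typed as object[] -- see later in the thread
      uow.Remove(new Query(typeof(Widget), Entity.Attribute("Id").In(ids)));
      uow.SaveChanges();
    }

    // Bulk insert: KeyTable identity lets LightSpeed assign ids in memory and
    // batch the INSERTs, whereas IdentityColumn forces a round-trip per row.
    context.IdentityMethod = IdentityMethod.KeyTable;
    context.IdentityBlockSize = 1000;

    using (IUnitOfWork uow = context.CreateUnitOfWork())
    {
      for (int i = 0; i < 5000; i++)
      {
        uow.Add(new Widget { Name = "Widget " + i });
      }
      uow.SaveChanges();   // INSERTs go to the database in batches
    }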
---
I'm sorry, but this wasn't very clear to me. To delete the items in bulk, here's what I did (I'm using VistaDB). The primary key of the table is called CommandId in the database, but it's mapped to Id in LightSpeed (if I'm not mistaken):

    // delete the elements where Id is in ids
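    // (reconstruction -- the code itself did not survive in the post; judging
    // from the reply below it was roughly the following, with Command as the
    // entity class and ids typed as long[])
    uow.Remove(new Query(typeof(Command), Entity.Attribute("Id").In(ids)));
    uow.SaveChanges();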
When I try to save the changes, here's the exception I get: a SystemException saying "Value is of unknown data type" (I'm using long values as ids). How can I fix this? Thanks.
---
You are probably passing a long[] to In(). In is declared as taking "params object[]", so when C# sees a long[], it goes "well, that's not an object[], so it must be a one-element params array where the one element is an array of longs." The fix is to type the array as object[] instead of long[]; this can easily be done using ConvertAll:

    long[] idvals = /* get array of longs from somewhere */;
    object[] ids = Array.ConvertAll(idvals, id => (object)id);

We appreciate this is confusing behaviour, but it's partly a C# language issue -- we'll look into whether we can improve the behaviour, but the above solution will see you right for now.
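To see the C# behaviour being described in isolation, here is a small self-contained example; the Show method below simply stands in for any "params object[]" parameter such as In's:

    using System;

    class ParamsDemo
    {
      // Stand-in for a method declared as taking "params object[]", like In().
      static void Show(params object[] values)
      {
        Console.WriteLine(values.Length);
      }

      static void Main()
      {
        long[] longs = { 1L, 2L, 3L };
        object[] objects = Array.ConvertAll(longs, id => (object)id);

        Show(longs);     // prints 1 -- the long[] is wrapped up as a single element
        Show(objects);   // prints 3 -- each value is passed as its own element
      }
    }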
---
Thanks for the suggestion. I tried what you said and the items are actually deleted from the database. I still have two issues:

1. How can I refresh the displayed data to take the new state of the database into account? I have a DevExpress grid that is attached to the data, and after the delete operation I still see the old data.

2. What performance can I expect? I tried removing about 5,000 records and it took 20-30 seconds on VistaDB. This is a really long time. Is that normal? (I hope not.)

Thanks a lot.
---
It sounds like you have a database issue. I just deleted 5000 rows out of a VistaDB 4 database using the query we've been discussing, and it took about 4 seconds. Could you be missing a primary key or other index that would let VistaDB rapidly locate rows?

I also captured the LightSpeed query and re-ran it using raw ADO.NET, and that also took... about 4 seconds. So, as I mentioned before, almost all the time appears to be spent in VistaDB, not in LightSpeed.

For comparison, I then went back to your original approach of deleting the entities using Remove(Entity) and then calling SaveChanges to commit all the deletes. This also took about 4 seconds. Doing 5000 DELETE commands using ADO.NET also took about 4 seconds. So I don't think LightSpeed is adding significant overhead; the performance issue appears to be at the database end rather than the LightSpeed end.

Regarding refreshing the data: since it appears from my testing that using 5000 Remove(Entity) calls is not in fact significantly slower than using the Remove(Query) call, I would recommend going back to that. This will result in all deleted entities changing their EntityState to Deleted as they are removed (note that they are NOT removed from the result list or the unit of work). Or you could just requery the database after doing the SaveChanges() -- this gets you a nice fresh unit of work and ensures that the user does not see stale data.
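A sketch of the two options, for what it's worth -- Widget, widgetsToDelete and the grid binding are placeholders, and the Find<Widget>() call is my assumption about the querying API, so check it against your version:

    // Option 1: remove the loaded entities individually. Their EntityState
    // changes to Deleted as they are removed, and the DELETEs are still
    // batched when SaveChanges runs.
    using (IUnitOfWork uow = context.CreateUnitOfWork())
    {
      foreach (Widget widget in widgetsToDelete)
      {
        uow.Remove(widget);
      }
      uow.SaveChanges();
    }

    // Option 2: after a Remove(Query) bulk delete, requery in a fresh unit of
    // work and rebind the grid so the user never sees stale rows.
    IUnitOfWork freshUow = context.CreateUnitOfWork();   // keep alive while the grid is bound
    var remaining = freshUow.Find<Widget>();             // assumed query call
    grid.DataSource = remaining;                         // grid = the DevExpress grid control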
---
Thanks for the info. I checked my database and I have the proper index on the primary key in place, but it still takes ages to perform the operation. I've also tried rebuilding the index and packing the database, but the results are the same. Can I send you my database to check, or can you send me the database you used for your testing? This is really strange. Thanks.
---
One more thing. I have a parent-child relationship in the database (and I'm deleting the child). Could this be the cause of the problem? Thanks.
---
Before we look at your database, could you try re-running the LightSpeed-generated SQL from raw ADO.NET? If ADO.NET gives you similar performance to LightSpeed, then it is a VistaDB problem, and obviously we can't offer support or diagnostics on VistaDB: your best bet is the VistaDB support forums. If, however, you see dramatically slower performance from LightSpeed, then we'd be happy to look at your database. Please also include a minimal but complete console project or NUnit test replicating the problem you're seeing, so that we can be sure we're testing exactly the same code as you are! You can attach a zip file via the Options tab. Thanks!
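If it helps, the raw ADO.NET comparison can be as simple as the console sketch below. The VistaDB provider class names (VistaDB.Provider.VistaDBConnection and its command), the database path and the DELETE statement are all assumptions -- paste in your own connection string and the SQL captured from the TraceLogger output:

    using System;
    using System.Diagnostics;
    using VistaDB.Provider;   // VistaDB 4 ADO.NET provider (namespace assumed)

    class DeleteTiming
    {
      static void Main()
      {
        using (var connection = new VistaDBConnection(@"Data Source=C:\data\sample.vdb4"))
        {
          connection.Open();

          var stopwatch = Stopwatch.StartNew();

          using (var command = connection.CreateCommand())
          {
            // Replace with the LightSpeed-generated SQL from your log.
            command.CommandText = "DELETE FROM Command WHERE CommandId IN (1, 2, 3)";
            command.ExecuteNonQuery();
          }

          stopwatch.Stop();
          Console.WriteLine("Elapsed: {0} ms", stopwatch.ElapsedMilliseconds);
        }
      }
    }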
---
Where can I download your sample database to try? I would like to see if I can reproduce your numbers with your database. If I delete records from VistaDB Data Builder, the performance is good (1,500 records deleted in 1.5 s). Thanks.
---
I just created a trivial database to test with. I've attached it.
---
When I try to download the file, I get an error.
---
Our over-protective forums software strikes again. Here's a zip file -- this should work.
---
I can see it this time. Could you also provide me with the sample application you used for testing both ADO.NET and LightSpeed performance? Thanks.
---
See the attached zip file. QueryingTests.cs contains the NUnit test that I used for timing it in LightSpeed. Please be warned that the file I'm sending you is stitched together from three files in our test suite, so you may need to fix up some namespaces. Program.cs is the program I used to test using raw ADO.NET; it's meant to be run as a console application. The SQL in Program.cs is pasted in from the SQL I logged in the NUnit tests. Let me know if you need any more info!
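For anyone reading along without the attachment, the general shape of such a timing test is roughly the following. This is not the attached file -- Widget, the connection string and the number of ids are placeholders:

    using System;
    using System.Diagnostics;
    using Mindscape.LightSpeed;
    using NUnit.Framework;

    [TestFixture]
    public class DeleteTimingTests
    {
      [Test]
      public void BulkDeleteByQuery()
      {
        var context = new LightSpeedContext
        {
          ConnectionString = @"Data Source=C:\data\sample.vdb4"   // placeholder
          // plus the DataProvider setting for VistaDB
        };

        // Build the id list as object[] so In() expands it correctly.
        var ids = new object[5000];
        for (int i = 0; i < ids.Length; i++)
        {
          ids[i] = (long)(i + 1);
        }

        var stopwatch = Stopwatch.StartNew();

        using (IUnitOfWork uow = context.CreateUnitOfWork())
        {
          uow.Remove(new Query(typeof(Widget), Entity.Attribute("Id").In(ids)));
          uow.SaveChanges();
        }

        stopwatch.Stop();
        Console.WriteLine("Deleted up to 5000 rows in {0} ms", stopwatch.ElapsedMilliseconds);
      }
    }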
---
I've built a sample application to test with my database. What I found is that with your database, the performance of ADO.NET and LightSpeed is about the same. With my database, the time LightSpeed takes to perform the deletes almost doubles. Do you have an email address I can send the test sample to? (I prefer not to post anything on this public forum.) Thanks.
---
Hi Nagar, could you please email it to support@mindscape.co.nz? Please note that your zip file must not contain any binary files (.exe, .dll) -- just the database file and any project files, please, as our email system will bounce messages that contain executables. Kind regards, John-Daniel