This thread looks to be a little on the old side and therefore may no longer be relevant. Please see if there is a newer thread on the subject, and ensure you're using the most recent build of any software if your question regards a particular product.
This thread has been locked and is no longer accepting new posts. If you have a question regarding this topic, please email us at support@mindscape.co.nz
|
I asked about an optimized "WCF DataContract <> Model mapper" and you guys liked the idea :) What about also generating unit tests (NUnit) to test every class/column, references etc.? Right now I write them manually in the different "layers", but it would be nice to have generated tests to confirm that the "data model/layer" is working correctly against your database model. |
|
|
I can't stop myself from butting in... That's an excellent suggestion, I think. I've been thinking of implementing this feature myself: generating standard CRUD unit tests dynamically to test the domain model and the database, in order to get some fail-early tests. It would be excellent to have this feature built into the ORM already! We have some long-running Selenium integration tests in our nightly build that automatically test the web GUI every night. But we have had problems with database changes not reflected in the model getting caught in these tests instead of in earlier (unit) tests. You really want these kinds of tests to catch business-related problems, not minor changes in the database, since they take a long time to run. Fail early, so to say. If tests could be generated to actually test the consistency of the database and the domain model, it would be excellent. Naturally, a mock database has to be built before each test run; we have that infrastructure in place already in our testing framework. Perhaps it is possible to build something database-provider independent in the ORM that also does this? br, Tobias PS. I know that LightSpeed now has an excellent sync tool in the new Designer Support in v2. But if you have a more complex database that, for instance, uses ThroughAssociations, it is a bit difficult to use this tool, I guess. |
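For what it's worth, the kind of generic CRUD smoke test I have in mind could look something like this. This is only a sketch in Python with an in-memory SQLite database for brevity; a generated version would of course be NUnit tests against the real model, and the `Customer` table here is made up for illustration:

```python
import sqlite3
import unittest


class CrudSmokeTest(unittest.TestCase):
    """Generic CRUD round-trip: insert, read back, update, delete.

    The idea is that one such test would be generated per entity, so a
    database/model mismatch fails fast here instead of in a slow
    nightly integration run.
    """

    def setUp(self):
        # In-memory database standing in for the mock test database.
        self.conn = sqlite3.connect(":memory:")
        self.conn.execute(
            "CREATE TABLE Customer (Id INTEGER PRIMARY KEY, Name TEXT NOT NULL)"
        )

    def test_crud_round_trip(self):
        cur = self.conn.cursor()

        # Create: insert a row and remember its key.
        cur.execute("INSERT INTO Customer (Name) VALUES (?)", ("Alice",))
        row_id = cur.lastrowid

        # Read: the row we just wrote must come back unchanged.
        (name,) = cur.execute(
            "SELECT Name FROM Customer WHERE Id = ?", (row_id,)
        ).fetchone()
        self.assertEqual(name, "Alice")

        # Update and verify.
        cur.execute("UPDATE Customer SET Name = ? WHERE Id = ?", ("Bob", row_id))
        (name,) = cur.execute(
            "SELECT Name FROM Customer WHERE Id = ?", (row_id,)
        ).fetchone()
        self.assertEqual(name, "Bob")

        # Delete and verify the row is gone.
        cur.execute("DELETE FROM Customer WHERE Id = ?", (row_id,))
        self.assertIsNone(
            cur.execute(
                "SELECT Name FROM Customer WHERE Id = ?", (row_id,)
            ).fetchone()
        )

        # Roll back so the test leaves the database untouched.
        self.conn.rollback()
```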
|
|
[quote user="Tobias"]I can't stop myself from butting in... That's an excellent suggestion, I think. I've been thinking of implementing this feature myself: generating standard CRUD unit tests dynamically to test the domain model and the database, in order to get some fail-early tests. It would be excellent to have this feature built into the ORM already! [/quote] YES :-) When Framework.ServiceModel goes live, I hope it generates unit tests for the CRUD there too :) [quote user="Tobias"]PS. I know that LightSpeed now has an excellent sync tool in the new Designer Support in v2. But if you have a more complex database that, for instance, uses ThroughAssociations, it is a bit difficult to use this tool, I guess[/quote] But if you could have some kind of "autosense" or "config mapping file", you could also generate ThroughAssociations directly, without doing it manually. I have a lot of different wishes for lsgen.exe; what about making an XML config for features, mapping, name conventions, excludes, includes, output, stripping etc., so you can regenerate your model like this: lsgen.exe /c:mymodel.config, lsgen.exe /c:mymodel2.config and so on? I have just started to port a model with currently 124 tables from NHibernate to LightSpeed, so I hope more features will be added to lsgen.exe (or the source code will be released) :-) |
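To illustrate what I mean, a config file could look roughly like this. This is purely hypothetical — none of these element or attribute names are real lsgen.exe options, it is just the sort of thing I have in mind:

```xml
<lsgenConfig>
  <connectionString>Data Source=.;Initial Catalog=MyDb</connectionString>
  <output directory="Generated" namespace="MyApp.Models" />
  <naming pluralizeTableNames="false" stripPrefix="tbl" />
  <include tables="Order, OrderLine, Customer" />
  <exclude tables="sysdiagrams" />
  <associations>
    <!-- generate a ThroughAssociation instead of two plain ones -->
    <through table="OrderLine" from="Order" to="Product" />
  </associations>
</lsgenConfig>
```

Then regenerating would just be: lsgen.exe /c:mymodel.config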
|
|
This is an attractive idea and I'd be interested to hear what you would like to see generated by way of tests. What would you expect to see in the generated Read/SELECT tests, for example? Loading an object from the database and checking its property values? Against what -- values that you supply? How about Create/INSERT -- how far would this need to go -- would you be looking for us to look at the validation settings and create tests around the boundaries? Do you envisage test generation being a one-off "generate a skeleton for me to fill in" or a "hidden file owned by the designer" (like the generated model code)? I guess I also have a reservation that tests for generated code may not prove very much (especially when those tests are generated off the same model -- you're kind of testing the model against itself!). I like the idea of having your test suite verify that the code and database are in sync, but I am not sure how profitable it is to do much more than this. We're certainly interested in discussing it though, and getting a better picture of what you want the tests to do and where you see the value in them -- for example, Tobias, you obviously have specific pain points that you feel could be addressed this way -- please tell us more! |
|
|
[quote user="csa"]what about making an XML config for features, mapping, name conventions, excludes, includes, output, stripping etc., so you can regenerate your model like this: lsgen.exe /c:mymodel.config, lsgen.exe /c:mymodel2.config and so on?[/quote] We did think about this, but we figured that if you were going to write an XML config file you could almost as easily write a batch file. We'll keep the idea in mind though, especially since a batch file might get pretty unreadable once you had a truckload of options on there! |
|
|
[quote user="ivan"] [quote user="csa"]what about making an XML config for features, mapping, name conventions, excludes, includes, output, stripping etc., so you can regenerate your model like this: lsgen.exe /c:mymodel.config, lsgen.exe /c:mymodel2.config and so on?[/quote] We did think about this, but we figured that if you were going to write an XML config file you could almost as easily write a batch file. We'll keep the idea in mind though, especially since a batch file might get pretty unreadable once you had a truckload of options on there! [/quote] :-) |
|
|
[quote user="ivan"]This is an attractive idea and I'd be interested to hear what you would like to see generated by way of tests. What would you expect to see in the generated Read/SELECT tests, for example? Loading an object from the database and checking its property values? Against what -- values that you supply? How about Create/INSERT -- how far would this need to go -- would you be looking for us to look at the validation settings and create tests around the boundaries? [/quote] I can see there are many questions to be answered here. To start with, perhaps it would be nice to test the default CRUD functions (with validation), just to "triple check" that the model is working correctly. One thing that would also be nice is to check that references and ThroughAssociations are working. I have experience with old models where everything looks fine until you try something that generates an "invalid model" exception... My goal for this feature is to get "minimal validation" that the model works without blowing up... [quote user="ivan"] Do you envisage test generation being a one-off "generate a skeleton for me to fill in" or a "hidden file owned by the designer" (like the generated model code)? [/quote] I personally like "generate a skeleton for me to fill in" most. |
|
|
lsgen.exe is missing an argument (or config property) to turn off "PluralizeTableNames" in the generated files. |
|
|
I think we would address your issue in a different way -- always singularise entity names (because a .NET class name should be singular), but provide an option to always generate a TableAttribute specifying the table name (thereby making the PluralizeTableNames setting irrelevant because TableAttribute overrides the inferred name). |
|
|
[quote user="ivan"]I think we would address your issue in a different way -- always singularise entity names (because a .NET class name should be singular[/quote] Yes, I agree with that |
|
|
About generation of unit tests: with scissors, paper and glue I have made my own "generic" tests, for both data and services, to test the model. It works nicely as a "smoke test" when you have made some changes and want to check them. These tests do not replace my standard unit tests and integration tests, of course. CodeSmith with generics, reflection and other fancy things does the trick... |
|
|
[quote user="csa"]I have a lot of different wishes for lsgen.exe; what about making an XML config for features, mapping, name conventions, excludes, includes, output, stripping etc., so you can regenerate your model like this: lsgen.exe /c:mymodel.config, lsgen.exe /c:mymodel2.config and so on?[/quote] It would also be nice if this configuration file could add "static" attributes to the different entities. Let's say you want EagerLoad on Orders.OrderLines; then it would be nice if, based on the config file, [EagerLoad] were added when running lsgen.exe. That way lsgen.exe could be used without manual changes afterwards. I have tried to add extra entity features from a partial class, but that is hard when the mapping is based on attributes, and lsgen.exe always overwrites them when the model changes. |
|
|
Sorry for not replying until now. Release planning days... I have created domain-specific tests for all the ORM queries I make to the database. Naturally, these tests would have to be updated each time the domain model changes too. But my tests do not necessarily cover all properties in the domain model class. Thus I realise they are not foolproof, and that I easily miss stuff that will instead be caught in the longer-running integration tests. Such "easy bugs" caught late have proven to be very costly, since these types of long-running integration tests only run once each night. So basically, many small errors can mean a whole week (weeks in our case...) without the actual, real, business-related integration testing that you would prefer at that stage. It can kill any deadline, basically. [quote user="ivan"] Yes, I agree. But I can guarantee you that we in this team would love these auto-generated tests, especially if you can make them run very fast so we get this type of inconsistency feedback early. Actually, come to think of it, you could probably create sync verification tests w/o actually using SELECT statements for each entity. This should make the tests a lot faster. Nice! |
|
|
[quote user="csa"] I personally like "generate a skeleton for me to fill in" most. [/quote] Hmm, I'm unsure of what you mean. The less extra manual work I have to do for the sync stuff, the better. Personally I am not a fan of generated code, but for stuff like this I would like it. I would want it to generate everything for me, though, with the partial keyword set on the class. And nowadays, in C# 3, it is possible to have partial methods too. But I am unsure how that would work for test methods. |
|
|
[quote user="Tobias"]Precisely. I would like, as CSA (Christian is it? :) also suggests,[/quote] Yes [quote user="Tobias"] [quote user="ivan"] Yes, I agree. But I can guarantee you that we in this team would love these auto-generated tests, especially if you can make them run very fast so we get this type of inconsistency feedback early. Actually, come to think of it, you could probably create sync verification tests w/o actually using SELECT statements for each entity. This should make the tests a lot faster. Nice! [/quote] I have made some "easy/stupid" tests with CodeSmith, generics, partials etc. now; it is really nice just to know that the database and model are in sync, and not have strange exceptions exploding around because of the "database changes you have made". I think it will be a time saver and an improvement to the code, since you can fix the "easy bugs" early/fast. Another thing is that reading the documentation and understanding every aspect of it can take some time... |
|
|
[quote user="Tobias"] Actually, come to think of it, you could probably create sync verification tests w/o actually using SELECT statements for each entity. This should make the tests a lot faster. Nice! [/quote] Sorry, I was probably a bit unclear here, but I suspect you are already on track with what I mean. I guess I could have made my previous posting a lot shorter, having realised what I did at the end of it. Sorry for that. Anyway, what I mean is that we don't actually need generated SELECT tests for each entity to verify that the domain model and the database are synchronised. I think most databases allow for a kind of reflection on schemas, from which you can retrieve all the metadata such as column names, lengths, constraints etc. So the only thing we would need is generated NUnit (and perhaps MSTest) tests that verify this database metadata against the domain model in code. This should make this type of test run much faster, with the benefit that you don't have to generate any test data at all. I guess you are already using metadata retrieval from the database when you do the synchronisation in the LightSpeed model view, so hopefully you have much of this in place already. It is just a question of generating some tests that we as customers can run on our build server. OK, Christian has a point that generated tests can also serve as a tutorial, but you already have the provided sample code for that. So, in order to get the much-needed fail-early (and fast) synchronisation validation tests, this type of metadata NUnit test is the only thing we need.
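To make the idea concrete, here is a minimal sketch of such a metadata sync check. It is written in Python against SQLite purely for brevity; the real thing would be generated NUnit tests using the provider's schema metadata, and the `EXPECTED_MODEL` dict is a hypothetical stand-in for what the designer knows about the entities:

```python
import sqlite3

# What the domain model believes the schema looks like.
# (Hypothetical entity; in the real feature this would be generated
# from the model definition, not written by hand.)
EXPECTED_MODEL = {
    "Customer": {"Id": "INTEGER", "Name": "TEXT", "Email": "TEXT"},
}


def schema_mismatches(conn, expected):
    """Compare the live database schema against the expected model.

    Returns a list of human-readable differences; an empty list means
    model and database are in sync. Only metadata queries are used --
    no SELECTs against table data -- so this runs fast.
    """
    problems = []
    for table, columns in expected.items():
        rows = conn.execute(f"PRAGMA table_info({table})").fetchall()
        # PRAGMA table_info rows: (cid, name, type, notnull, dflt, pk)
        actual = {name: ctype for (_, name, ctype, *_rest) in rows}
        if not actual:
            problems.append(f"table {table} missing from database")
            continue
        for col, ctype in columns.items():
            if col not in actual:
                problems.append(f"{table}.{col} missing from database")
            elif actual[col] != ctype:
                problems.append(
                    f"{table}.{col} is {actual[col]}, model says {ctype}"
                )
        for col in actual:
            if col not in columns:
                problems.append(f"{table}.{col} not in domain model")
    return problems


# A database that has drifted out of sync (Email column missing):
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Customer (Id INTEGER, Name TEXT)")
for problem in schema_mismatches(conn, EXPECTED_MODEL):
    print(problem)
```

A generated test would simply assert that the returned list is empty, giving you the fail-early sync check with one metadata query per table.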
|
|
|
Hello guys, I just wanted to let you know that although you're not hearing much from us on this thread we are paying attention! We're not going to be able to introduce this feature in the 2.2 timeframe but please do keep the ideas coming -- testability improvements are definitely on the radar for post-2.2 and we very much appreciate the suggestions. |
|
|
Ok, sounds fine. Although it might mean that we have to implement my suggested sync solution ourselves, which would be specific to MySQL but agnostic to the number of tables, columns, constraints etc. in a schema. If so (in a month or two is my guess), I can send you guys our solution in case you want to nick any ideas from it. You would obviously have to write a generic (and probably much better) solution for LightSpeed, though.

Just some input for you regarding testing that you probably already know, and which I have been nagging about some already... The whole concept of test execution speed / failing early is extremely important from a Continuous Integration perspective (we've found that out the hard way, in practice, not theory). This is a problem when it comes to database testing, since a tier to the database needs to be crossed, so it will always be expensive performance-wise and therefore slow. Although one database call is not much cost, it quickly accumulates into a long test harness time as more functionality/tests get added. I see (and we use) basically the following CI test categories/projects:

1. Unit tests: run at every check-in. No, or very few, tier crossings allowed. No test in a higher category can run if all lower-category tests have not passed yet.

Now, LightSpeed's generated tests, I reckon, would be suitable for categories 1 and 2. For category 1, I would like to have that sync check between database and model. Although a tier is crossed, hopefully only one database call is needed to retrieve this info, thus making it possible for this test to live in category 1. Extremely good early feedback! In category 2, tests can be generated to check different CRUD combinations. Database state for save operations is a tricky beast, but I reckon a rollback in a transaction can solve that part easily. Though you can't generate tests that would actually test any business logic, I think these tests are good to have anyway. If nothing else, you guys at Mindscape would get early feedback on possible LightSpeed bugs in a context not polluted by (for you) irrelevant business rules.

I have one more auto-generated file suggestion (completely different) that I might as well bring up now too: in order for our category 2 tests to work, we have scripts that generate the schema, tables, etc. for us. As a developer you usually manipulate the database first, and then at a later stage, once you see that everything works properly, you update the schema create script. But it is easy to forget this step, and then a category 2 or 3 test will fail, which can be very costly... I would love an ability in LightSpeed to generate such a create script file for us, based on the current version of the database schema. I.e., push a "button" in Visual Studio manually and you'll have the schema create script updated for you.
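As a sketch of that last suggestion: SQLite happens to store the DDL verbatim in its `sqlite_master` catalog, so there the idea reduces to a few lines (shown in Python for brevity; other databases would need to reassemble the script from metadata views such as INFORMATION_SCHEMA, which is the part a generic LightSpeed feature would have to do):

```python
import sqlite3


def dump_schema(conn):
    """Extract the CREATE statements for all tables and indexes from
    the live database, so the create script can be regenerated on
    demand instead of being maintained by hand (and forgotten)."""
    rows = conn.execute(
        "SELECT sql FROM sqlite_master WHERE sql IS NOT NULL ORDER BY name"
    ).fetchall()
    return ";\n".join(sql for (sql,) in rows) + ";\n"


# Build a small example database, then dump its create script.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Customer (Id INTEGER PRIMARY KEY, Name TEXT)")
conn.execute("CREATE TABLE Orders (Id INTEGER PRIMARY KEY, CustomerId INTEGER)")
print(dump_schema(conn))
```

Hooked up to a "button" in Visual Studio, the output would just be written to the schema create script file in the project.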
|
|