We have a situation where there is a periodic cleanup of databases. There are over 40 databases involved, all different and all changing (please don't ask how this came about). Maintaining ongoing context models of them is unrealistic.
Periodically we receive a stack of transactions that look like this: stepNo (int), serverName (string), dbName (string), tbName (string), keyName (string), minVal (int). The step to be taken for each one is basically (in SQL): DELETE FROM serverName.dbName.tbName WHERE keyName >= minVal. It is a simple batch operation, but we have no advance knowledge of what we'll find in the stack, and there can be thousands of entries. The primary issue is that the deletes have to cascade. Cascading deletes are not supported in the database design, so they are not automatic. LightSpeed seemed to be the ideal solution, except that the context requires advance notice of the database structure.
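To spell that out, each entry in the stack is essentially the following shape (the C# below is purely an illustration of the data we receive, not code we actually have; the type and member names are made up):

    // Hypothetical shape of one cleanup step from the stack.
    class CleanupStep
    {
        public int StepNo;
        public string ServerName;
        public string DbName;
        public string TbName;
        public string KeyName;
        public int MinVal;

        // The SQL each step boils down to. Cascading the delete to child
        // tables is the hard part and is not shown here.
        public string ToSql()
        {
            return string.Format(
                "DELETE FROM {0}.{1}.{2} WHERE {3} >= {4}",
                ServerName, DbName, TbName, KeyName, MinVal);
        }
    }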
I can construct the connection string from the server and database names. Is there some way to execute the deletes without knowing the tables involved until runtime - basically getting all the information in real time? Can the context be built as part of the runtime process, or can LightSpeed act on "unknown" tables given a valid connection string and the rest of the information for each transaction? Your help would be much appreciated. Our organization is new to LightSpeed and we are trying to implement it throughout. Thanks.
|
|
No, LightSpeed can't act on "unknown" tables. You need to have entity classes, and for cascade deletes you need to have associations between those entity classes. However, if you can construct entity classes at run time, then LightSpeed will be happy to work with them. That is, the entity classes don't need to be baked into the application; they only need to exist by the time you start hitting the database. And the core API allows you to work with entities in a weakly typed way, e.g.:

    UnitOfWork.Remove(new Query(tblType, Entity.Attribute(colName) >= minVal));

There are a couple of ways to do exactly this.

One way is to use the lsgen command line tool. You can point lsgen at your database and it will spit out C# or VB code for entities that reflect the database structure. Compile that code (by calling out to csc.exe or vbc.exe, or by using the CodeDom providers) and you get an assembly containing entity classes. You can then load that assembly using an Assembly.Load* method.
PRO: Gets the structure directly from the database, so totally automatic.
CON: Database inference can be confused by legacy schemas (which it sounds like yours is!); a bit slow (though that probably isn't an issue for a periodic cleanup operation).

A more sophisticated way is to use Reflection.Emit or CodeDom to generate classes on the fly, for example from a configuration script that specifies the columns and associations. This is a bit fiddly at first, but LightSpeed entities are pretty regular, so it's not too bad once you've got your head around it.
PRO: Allows you to specify the schema.
CON: You'll need to write the class generation code yourself, and some devs find Reflection.Emit and CodeDom a bit intimidating, so there might be maintenance issues.

A possible hybrid would be to put the configuration script through a code generator/templating engine like T4. This would avoid dealing with the complex Reflection.Emit and CodeDom APIs while still giving you the precise control of a configuration script instead of relying on inference from the database.

Note that you do NOT need a .lsmodel file. The entity classes are all that are required. Hope this makes sense -- let me know if you need any more info.
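To sketch the weakly typed path out a little more: once you have an assembly of generated entities loaded, the delete for one cleanup step could look roughly like this. This is an untested sketch -- the namespace, parameter names, and exact namespaces/calls should be double-checked against the LightSpeed documentation:

    using System;
    using System.Reflection;
    using Mindscape.LightSpeed;
    using Mindscape.LightSpeed.Querying;

    static class WeaklyTypedDelete
    {
        // Resolve the generated entity type by table name, then issue the
        // weakly typed batch delete for keyName >= minVal.
        public static void DeleteFrom(Assembly entities, string connStr,
                                      string tbName, string keyName, int minVal)
        {
            Type tblType = entities.GetType("TempNamespace." + tbName);  // namespace is a placeholder

            LightSpeedContext context = new LightSpeedContext { ConnectionString = connStr };
            using (var uow = context.CreateUnitOfWork())
            {
                uow.Remove(new Query(tblType, Entity.Attribute(keyName) >= minVal));
                uow.SaveChanges();  // commit the pending batch delete
            }
        }
    }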
|
|
Thanks Ivan. Here are my thoughts: only one solution is automatic, the one using lsgen, and you don't sound very confident about it working with our schemas. Also, I would need a little guidance on the implementation, since I am not familiar with the products and am also relatively new to .NET/VS, though not to programming or any other IT discipline. What I see in real life is that I can go into VS, add a new LightSpeed model, and drag and drop to create entities, all fully automated. What I was hoping for, I guess, is some way to do the equivalent as part of the runtime process (C#):
That is what I am looking for. My alternative may be to issue SQL commands (via LINQ's ExecuteCommand()) and walk down the tree using the system tables. If you can think of some way to do that, I will definitely use it (and speed up the purchase of the necessary licenses). If you think lsgen is a viable alternative, can you please point me to some useful documentation and explain how to use it in a C# program?
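For what it's worth, the kind of thing I have in mind for the fallback is roughly the following. The SQL uses the SQL Server catalog views; the C# around it is just a sketch using plain SqlClient (rather than LINQ's ExecuteCommand) to keep it self-contained, and the actual cascade would still have to join each child to its parent on the foreign key columns:

    using System;
    using System.Collections.Generic;
    using System.Data.SqlClient;

    static class CascadeHelper
    {
        // Returns (childTable, childColumn) pairs for foreign keys that reference parentTable.
        // Note: in sys.foreign_keys, parent_object_id is the *referencing* (child) table
        // and referenced_object_id is the table being referenced.
        public static List<Tuple<string, string>> FindChildren(SqlConnection conn, string parentTable)
        {
            const string sql = @"
                SELECT OBJECT_NAME(fkc.parent_object_id),
                       COL_NAME(fkc.parent_object_id, fkc.parent_column_id)
                FROM   sys.foreign_keys fk
                JOIN   sys.foreign_key_columns fkc ON fkc.constraint_object_id = fk.object_id
                WHERE  fk.referenced_object_id = OBJECT_ID(@parentTable)";

            var children = new List<Tuple<string, string>>();
            using (var cmd = new SqlCommand(sql, conn))
            {
                cmd.Parameters.AddWithValue("@parentTable", parentTable);
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                        children.Add(Tuple.Create(reader.GetString(0), reader.GetString(1)));
                }
            }
            return children;
        }
    }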
I have used my personal email for this, but my company, Telarix, produces an application for large telecommunication companies, and my group is hard at work trying to improve many aspects of this product. LightSpeed is one of the first enhancements. Our homegrown data layer is not very good; it will be replaced with your ORM. My little utility is not the primary use by any stretch of the imagination, but if I could get it working quickly it sure would speed up the process and make my life a lot easier. Thanks much. Dan
|
|
I have seen Reflection.Emit mentioned elsewhere in this forum, which got me looking into it. There is a good article from MS here: http://msdn.microsoft.com/en-us/library/ms973916.aspx While we don't have an immediate need for it, having that capability on hand could be very valuable. Our data layer was also designed to resemble an ugly monster, and our data is spread across multiple legacy systems, so creating something as simple as one coherent "Customer" business object that integrates all the various data layers becomes a real headache. Ivan, have you had any other customers successfully implement a working example that utilizes Reflection.Emit? In the future, is there any possibility that Mindscape could provide T4 or some other out-of-the-box support so we can hit the ground running with Emit? Just a thought... Thanks.
|
|
Hi DCDan,

The reason I was expressing reservations about lsgen was that your database setup sounded very 'legacy.' lsgen works well with databases that are 'well behaved' but can require manual intervention for databases that are not so well behaved. The main things that can cause problems are (1) composite foreign keys and (2) primary key columns that are also foreign key columns. If these scenarios don't turn up in your database, then lsgen should meet your needs. However, if your schemas are going to involve lots of composite and cross-cutting keys, then I'm afraid we don't have anything fully automated for you -- sorry.

If the composite key stuff isn't a showstopper, then using lsgen as part of your utility is a bit laborious but not too difficult. Here's a rough outline.

1. Invoke lsgen using Process.Start and wait for it to finish:

   Process p = Process.Start("lsgen.exe", "/p:MySql5 /l:cs /o:tempfolder /n:TempNamespace /c:" + connstr);
   p.WaitForExit();

2. This leaves a bunch of C# source files in the tempfolder directory. Compile them against the LightSpeed DLLs:

   Process c = Process.Start("csc.exe", @"/t:library /r:Mindscape.LightSpeed.dll /out:tempdll.dll tempfolder\*.cs");
   c.WaitForExit();

3. Load the generated assembly and locate the relevant entity type:

   Assembly a = Assembly.LoadFrom("tempdll.dll");
   Type tblType = a.GetType("TempNamespace." + tbName);

4. Issue your Remove query:

   LightSpeedContext context = new LightSpeedContext { ConnectionString = connstr, DataProvider = DataProvider.MySql5 };
   using (var uow = context.CreateUnitOfWork())
   {
     uow.Remove(new Query(tblType, Entity.Attribute(keyName) >= minVal));
     uow.SaveChanges();
   }

I've ignored error handling here, and I've also avoided CSharpCodeProvider (I would normally use this for compilation, but the "shell out to an EXE" idiom is probably more familiar if you haven't previously done any dynamic compilation -- do check it out though). I've also assumed that this is a utility which will exit after running, so we don't have to worry about unloading the dynamic assembly, and of course I haven't cleaned up all the temporary files (again, CSharpCodeProvider can help avoid this). Hopefully, however, it is enough to get you started, and we will be happy to support you as you refine it. (By the way, for lsgen documentation see Help Topics > Modelling > Command Line Tool in the help file.)
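If you do later want to go the CSharpCodeProvider route instead of shelling out to csc.exe, a rough (untested) sketch of step 2 would look something like the following; compiling in memory means there is no tempdll.dll left to clean up:

    using System;
    using System.CodeDom.Compiler;
    using System.IO;
    using System.Reflection;
    using Microsoft.CSharp;

    static class EntityCompiler
    {
        // Compiles the lsgen output in-process instead of shelling out to csc.exe.
        public static Assembly CompileGeneratedEntities(string sourceFolder)
        {
            var parameters = new CompilerParameters
            {
                GenerateInMemory = true,      // nothing left on disk to clean up
                GenerateExecutable = false
            };
            parameters.ReferencedAssemblies.Add("Mindscape.LightSpeed.dll");

            var provider = new CSharpCodeProvider();
            CompilerResults results = provider.CompileAssemblyFromFile(
                parameters, Directory.GetFiles(sourceFolder, "*.cs"));

            if (results.Errors.HasErrors)
            {
                // lsgen output that doesn't compile usually means the schema
                // confused the inference step -- dump results.Errors to see why.
                throw new InvalidOperationException("Entity compilation failed.");
            }

            return results.CompiledAssembly;
        }
    }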
|
|
Hi SR8,

Yes, you can use T4 to generate entities. LightSpeed entities are just code: there is no need to have a .lsmodel file. So you could write a T4 template (or Reflection.Emit) that pointed at your database and generated LightSpeed entities from that.

However, I am not sure how T4 or Reflection.Emit would help with the issue of a single business object spanning multiple legacy systems. Both T4 and Reflection.Emit require a program (expressed in T4 as a template, but it's still programmatic logic) to tell them how to generate code from wherever: as far as I can see, they'll only solve your problem if your business objects map to legacy systems in a predictable way (or at least a way that can be described in a program), and of course one of the defining features of legacy systems is their *un*predictability!

However, feel free to share more about your environment and how you see code generation via T4 or Reflection.Emit helping to automate away the integration problem -- we would be delighted to offer suggestions or to advise on the specifics of dynamically generating LightSpeed entities!
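For anyone wanting to experiment, a bare-bones Reflection.Emit starting point looks roughly like this. It is deliberately not LightSpeed-specific -- a real generated entity would derive from the LightSpeed Entity base class and declare its mapped fields and associations -- it just shows defining a type and a field at run time:

    using System;
    using System.Reflection;
    using System.Reflection.Emit;

    static class EmitSketch
    {
        // Defines an empty class named after a table at run time and returns the Type.
        public static Type BuildType(string typeName)
        {
            AssemblyName asmName = new AssemblyName("DynamicEntities");
            AssemblyBuilder asmBuilder = AppDomain.CurrentDomain.DefineDynamicAssembly(
                asmName, AssemblyBuilderAccess.Run);
            ModuleBuilder modBuilder = asmBuilder.DefineDynamicModule("MainModule");

            TypeBuilder typeBuilder = modBuilder.DefineType(
                typeName, TypeAttributes.Public | TypeAttributes.Class);

            // Add a private field the way a generated entity would carry a column value.
            typeBuilder.DefineField("_id", typeof(int), FieldAttributes.Private);

            return typeBuilder.CreateType();
        }
    }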
|
|
Thanks, Ivan. A "legacy" database, well designed in the early 90s, would be a good "well-behaved" database. These were designed a lot more recently, but not very well, I'm afraid. I'm trying to break some of those old habits. Many-to-many relationship tables almost always have composite primary keys that are also foreign keys to the parent tables. Often, those are the only columns in the table. I suspect there are a number of those, so lsgen is out. Thanks for your help, but I think this time we do it with SMO. By the way, your helpful hints were very good and I appreciate all the extra effort. I have an app coming up where they will prove quite useful. I still believe, however, that run-time context creation/entity absorption would be a great addition. Also, the level of support you have shown greatly improves my comfort with the fact that we have chosen the right product. We have between 30 and 50 developers at any given moment, so a site license would be the obvious choice. Thanks very much. Dan
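For reference, the SMO route I have in mind would start out roughly like this (a sketch only; the server, database, and table names are placeholders):

    using System;
    using Microsoft.SqlServer.Management.Smo;

    static class SmoCascadeSketch
    {
        // Walks the foreign keys in a database to find the child tables of a parent.
        public static void ListChildren(string serverName, string dbName, string parentName)
        {
            Server server = new Server(serverName);
            Database db = server.Databases[dbName];
            Table parent = db.Tables[parentName];   // the table from the cleanup step

            foreach (Table table in db.Tables)
            {
                foreach (ForeignKey fk in table.ForeignKeys)
                {
                    // In SMO, a table's ForeignKeys are the keys *it* declares, so
                    // 'table' is the child and fk.ReferencedTable is the parent it points at.
                    if (fk.ReferencedTable == parent.Name && fk.ReferencedTableSchema == parent.Schema)
                    {
                        Console.WriteLine("{0}.{1} references {2}", table.Schema, table.Name, parent.Name);
                    }
                }
            }
        }
    }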
|