jb's Blog


"behold the turtle, he only makes progress when he sticks his neck out"

Useful ASP.NET MVC Resource

Tagged as MVC

ScottGu has posted about a new sample ASP.NET MVC application put together by some of the more vocal members of the ASP.NET MVC team (Phil Haack, Scott Hanselman and others), which forms part of their new book on ASP.NET MVC.

They have posted a free PDF of the part of the book that focuses on getting started with ASP.NET MVC and building up the NerdDinner site, and all of the source code is available on Codeplex for you to make use of.

Thanks guys!



Updates in the world of SQL Data Services

Tagged as SQL Server

If you have been interested in taking advantage of the cloud, then you will likely be interested in some news coming out of the SQL Data Services (SDS) team. David Robinson has posted about some upcoming changes to SDS which bring the full set of SQL Server capabilities on stream by allowing TDS access (the standard SQL Server protocol), so you can connect to SDS from inside your Azure hosted application.

Full Relational Capabilities

This makes available all of the standard SQL Server capabilities most developers would expect, and it removes the design constraints you might have been facing with the much simpler ACE (Authority, Container, Entity) model which SDS previously offered.
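In practical terms, TDS access means your existing ADO.NET data access code should carry across largely unchanged. Here is a minimal sketch of the idea – the connection string, table name and credentials are placeholders, since the real values will come from however SDS provisions your database:

using System;
using System.Data.SqlClient;

class SdsConnectionSketch
{
  static void Main()
  {
    // Placeholder connection string – substitute the server, database and
    // credentials issued when your SDS database is provisioned.
    string connectionString =
      "Server=your-sds-server;Database=YourDatabase;User ID=yourUser;Password=yourPassword;";

    using (SqlConnection connection = new SqlConnection(connectionString))
    {
      connection.Open();

      // Standard T-SQL, exactly as you would run against an on-premise SQL Server.
      using (SqlCommand command = new SqlCommand("SELECT COUNT(*) FROM Customers", connection))
      {
        int count = (int)command.ExecuteScalar();
        Console.WriteLine("Customers: {0}", count);
      }
    }
  }
}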

IMHO this should really help drive adoption of Azure. I'm also hoping we will see some tighter deployment capabilities released with the upcoming Azure CTP drops, which would go a long way towards letting you develop locally, deploy locally, or deploy to the cloud without having to re-design the app to fit the environment. Good stuff!

So what about the ACE model?

In his post David suggests the ACE model will be discontinued over time – I suspect the Azure team will be keen to drop it as soon as is practical, but if you still want an ACE-style model I believe you can opt for the lower level Azure Table Storage approach. The other implication of the change is that the REST based APIs may also go away, or be replaced with Astoria based equivalents sitting over SQL Server on Azure.

All in all a very promising move! It sounds like the first we will see of this will be around mid year, so we shall see how it all looks then. Of course we will be quite keen to get LightSpeed working well on Azure hosting with this approach :)



ASP.NET MVC and LightSpeed – Building a Website

Tagged as LightSpeed, MVC, jQuery

There have been a number of recent posts discussing how to get started with ASP.NET MVC and various bits of kit, so I thought I would pop up my 2 cents on getting up and running with ASP.NET MVC and LightSpeed since I think they pair up quite nicely.

In addition we will look at leveraging:

  • jQuery – for our client side goodness
  • NUnit – for our unit testing
  • Ninject – for managing our dependencies

What are we building?

Last year for Tech Ed I used a sample built around a Film Festival website. I thought I would refresh the sample to use the latest bits following an earlier comment asking if I was going to publish the code (which I was originally intending to put on Codeplex but ended up getting lazy and letting it slip..)

So the domain centres around a schedule of films running in cinemas around the country and our system allows people to locate films and then book some tickets.

What’s the plan?

Generally I find it best to start with an initial cut of the domain model. Then we can set up some unit tests to cover the model, and once we are happy with that we can look at getting the web site itself set up. Finally we can start building out some functionality.

So let's start with the Domain Model

Like most developers I am fairly used to modelling the data first, using a tool like the SQL Server Management Studio diagramming support. I also happen to have the database I developed for this solution last year, so we can continue with that. It looks like this:

Shot002

Note that the KeyTable table is some plumbing that we will use with LightSpeed to support our identity strategy. This is using the KeyTable identity pattern described by Fowler in PoEAA.
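If you haven't come across the pattern before, the idea is that a single table hands out blocks of ids, so identities can be assigned in memory before anything is inserted. Here is a rough conceptual sketch of the allocation step – LightSpeed does this for you internally, and the table and column names here are purely illustrative:

using System;
using System.Data.SqlClient;

class KeyTableSketch
{
  // Illustrative only – LightSpeed manages key allocation itself.
  public static int AllocateBlock(string connectionString, int blockSize)
  {
    using (SqlConnection connection = new SqlConnection(connectionString))
    {
      connection.Open();
      using (SqlTransaction transaction = connection.BeginTransaction())
      {
        // Reserve a block of ids in one atomic step, then hand the ids out in memory.
        SqlCommand command = new SqlCommand(
          "UPDATE KeyTable SET NextId = NextId + @BlockSize; " +
          "SELECT NextId - @BlockSize FROM KeyTable;",
          connection, transaction);
        command.Parameters.AddWithValue("@BlockSize", blockSize);

        int firstIdInBlock = Convert.ToInt32(command.ExecuteScalar());
        transaction.Commit();
        return firstIdInBlock;
      }
    }
  }
}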

Creating our Domain Model using LightSpeed

Now that we have the data definition of the model, let's bring it into a working domain model by using LightSpeed to describe our entities. This is going to be pretty straightforward, since we can use the Visual Studio designer to drag our tables on from the database definition and then extend the entities from there.

I am going to start by setting up a fresh solution with a single class library project called Model. This is where we will set up our domain model entities.

Shot003

Next we need to add a new LightSpeed model. If you have not already installed it (and why not?!?) – download and install the trial version of LightSpeed, which will add the associated templates into Visual Studio.

Shot004

Once we add this, we get the LightSpeed design surface, which allows us to drag tables from the Server Explorer over onto the surface; that code-gens the associated entities into classes for us to use. Because we are using .NET 3.5 it will also build us a LINQ context, so we can start querying with LINQ straight away.

This should look somewhat familiar to our earlier ERD :)

Shot005

If you have a look at Model.cs, which is a file associated with the LightSpeed .lsmodel file, you can see what has been code generated – a set of partial classes with the field, property and relationship definitions that match the model we have described.

Shot007

So we now have a working domain model which we can start extending with some behaviour. To do this, we just need to create a partial class for the entity we want to extend. Let's use Cinema as an example: you will notice it has a GeoLocation value which we are going to store as a SqlGeography data type in SQL Server. We may want to add a property which returns the coordinates of that location as a formatted string.

We would just create a new class file, Cinema.cs and fill it out as follows:

using Microsoft.SqlServer.Types;
using System;

namespace Model
{
  // Partial class extending the LightSpeed generated Cinema entity with behaviour.
  public partial class Cinema
  {
    // Returns the cinema's location as a "latitude,longitude" string,
    // falling back to "0,0" if the value is missing or cannot be parsed.
    public string Coordinates
    {
      get
      {
        if (GeoLocation == null) return "0,0";

        try
        {
          SqlGeography instance = SqlGeography.Parse(GeoLocation.ToString());

          return String.Format("{0},{1}", instance.Lat, instance.Long);
        }
        catch (Exception)
        {
          return "0,0";
        }
      }
    }
  }
}

We now have our initial Domain Model

In a very short space of time we have taken our existing data model and represented it as a domain model, using LightSpeed to reflect the data schema and automatically generate a class definition for each entity we described. We can now take the Model assembly and, in conjunction with LightSpeed, fetch and persist these entities against a database.
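To give a feel for what that looks like, here is a rough sketch of querying and saving entities through a LightSpeed unit of work. Treat it as illustrative: the inline configuration values and the member names on the generated strongly-typed unit of work (the ModelUnitOfWork class name and the Cinemas query property) are assumptions on my part rather than code lifted from the sample.

using System;
using System.Linq;
using Mindscape.LightSpeed;
using Model;

class ModelUsageSketch
{
  static void Main()
  {
    // Context configuration shown inline for brevity; values here are illustrative.
    LightSpeedContext<ModelUnitOfWork> context = new LightSpeedContext<ModelUnitOfWork>
    {
      ConnectionString = @"Server=.\SQLEXPRESS;Database=FilmFestival;Integrated Security=true",
      DataProvider = DataProvider.SqlServer2005
    };

    using (ModelUnitOfWork unitOfWork = context.CreateUnitOfWork())
    {
      // Query the model with LINQ via the generated strongly-typed unit of work.
      var cinemas = unitOfWork.Cinemas
                              .Where(c => c.Name.StartsWith("Embassy"))
                              .ToList();

      // Add a new entity and persist the changes.
      Cinema cinema = new Cinema { Name = "Paramount" };
      unitOfWork.Add(cinema);
      unitOfWork.SaveChanges();
    }
  }
}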

Playing along at home?

Here is the code and the associated database setup script covering what we have done so far (remember to create a database called FilmFestival and run the script in the context of that database first!). Also remember to download and install LightSpeed.

Download FilmFestival2009_Part1.zip (8KB)

Ok – What’s next?

Next we should look at setting up some tests for the model, and then get our web infrastructure underway using ASP.NET MVC..



Setting up a new project at Mindscape

One of the questions we regularly get asked is how we go about building our software. Like most development teams we get quite a bit of assistance from automated scripts and supplementary tools, as well as Visual Studio, in getting the job done.

So to give you a better insight into how we do things, I thought I would quickly elaborate on what we would do if we were setting up a new project to highlight where we are getting some leverage from tools.

1. Start with the Right Structure

One of the things that always helps is organising your source code and associated assets into a sensible structure, so the first thing we do when setting up a new project is get organised. We typically find ourselves storing code, reference assemblies, documentation and tools.

So our initial structure looks like this:

SolutionStructure

This structure is pretty standard if you have worked on OSS projects or the like. Note, all the source code goes under Src :) Having a consistent structure means I know where to find things. Why would you not do this?

 

2. Put it in Source Control immediately

We primarily use Subversion and maintain a number of repositories for the various things we work on – e.g. One for Mindscape, one each for side projects, one for our work on Valuecruncher etc etc.

If you are using SourceSafe – get off it immediately and look at Visual Studio Team System, Subversion or SourceGear’s Vault as alternatives. Ultimately you want a solution that integrates well with how you are performing your development and gives you enough capability that you don’t get caught up spending time worrying about it. For us, this meant something that could work across the Internet (so we can work at home some of the time!), be decoupled from Visual Studio (because we love the file system) and be easy to get installed and running. We use TortoiseSVN in conjunction with Subversion for tooling – this works really well for us.

One of the very useful views over our Subversion repository is the timeline view of activity.

TimelineView

We use Trac for this, which Andrew introduced very early on in Mindscape’s life, but really it’s just an enumeration over the change sets, so it can be achieved regardless of how you manage your source – TeamCity also gives you the same view. This is amazing for watching what is going on – how frequently people are committing and what they are committing – and it also lets you code review continuously by watching the changes.

 

3. Make it Portable

I tend to work across a number of different machines (work, home, roaming, VMs) and often want to work on a project from all of these environments. We also have a build server that wants to help out by building and testing things for us :) So it is critical that the whole solution, and everything required to build it, lives within the confines of our source controlled structure.

In the structure above, the Lib and Tools directories have a big role to play here. Any referenced assemblies are referenced out of Lib – the only real exception is references to the .NET Framework itself. Any tools we need to build, deploy or test the solution are stored under the Tools folder. This means our build server in particular can grab a clean copy each time it needs to do a build and actually build it, without us having to spend hours setting up dependencies on the box.

One of the other important things is to make sure an environment can be provisioned largely from scratch. If you are familiar with Unix conventions, you will know about Makefiles and the role they play. I look to set up something similar, generally with batch files or PowerShell scripts, to provision the environment.

 

4. Automate the build as soon as possible

We use TeamCity to manage our automated builds, both integration and nightlies – it is a fantastic product and well worth looking into if you don’t have something already. We used CruiseControl at Mindscape before picking up TeamCity, and my personal take is that TeamCity is massively superior in all aspects, mostly for removing the administration hassle of setting things up.

TeamCitySample

At a glance you can see everything is good in the world of LightSpeed :)

TeamCity works by monitoring our Subversion repositories and, on the appropriate trigger (e.g. a code check-in), grabbing the latest copy of the source and kicking off a build. We script the builds themselves using MSBuild, which generally does these things:

  • Build each code solution involved in the system
  • Build any test structures required (e.g. databases – those early build scripts come in handy at this point!)
  • Run all unit tests with code coverage
  • Run any supplementary static analysis checks (e.g. StyleCop)
  • Run any supplementary actions (e.g. Build an installer)

One of the advantages of using an MSBuild script is that you can always crack open a command line locally and run it yourself. It should succeed; if it doesn’t, you would be breaking the build (no, that’s not a good thing!).

Often we have our build process automatically deploy to a test environment (e.g. update a website). Again this just saves you time and increases your confidence in making changes since you can immediately get feedback.

 

And we are done..

Nothing I have described above is particularly hard, especially once you have incurred the one-time setup cost of getting Subversion and TeamCity installed somewhere. The benefits are huge though – working software from the get go, quick feedback when I make changes (and break the build..) and no time wasted having to “prepare a release” or the like.

One of the most valuable investments you can make in a project is to get the process and tooling right up front – this leaves you free to focus on what matters – writing good code.



Loving ASP.NET MVC

Tagged as LightSpeed, MVC, jQuery

I thought I would share a few thoughts about ASP.NET MVC, as I have really enjoyed working with it over the last year. As you are likely aware this has been an “out of band” release, but it is going to be bundled with .NET 4.0, so we can expect it to be a mainstay in the .NET world going forward.

While not the most sophisticated piece of technology out there, it stands as a great example of Microsoft “getting it right” in recent times. The openness of the team, the responsiveness to feedback and the good design of the framework are what stand out for me.

For the latter part of last year, JD and I spent some time building out a reasonably sized ASP.NET MVC implementation for Givealittle.co.nz, which gave us some interesting insight into building a real world app on the framework.

The solution itself comprises around 10,000 lines of code and a suite of ~600 unit tests, and is broken into 6 projects (Common, Model, Security, Web, Routes and UnitTests) – an integration build takes ~1 minute on our build server.

So how did it work?

Back when we first kicked off the project the bits were at Preview 1; when we finished they were at Beta, and now they are at RC. So quite a bit has happened in between, but the solution hasn’t changed much architecturally and logically looks like this:

MVCLogical

As you can see we deviated a bit from the “out of the box” offering with MVC on a few fronts:

We chose to use NHaml for our view engine for two reasons: first, we really liked the symmetry with Rails, and secondly, it made the views feel more connected to the HTML and CSS being used in the solution. If you are in the same camp you should give NHaml a look. There are other interesting view engines out there too, so remember to shop around and find the one which fits you best.

For the RIA aspects we used jQuery – a totally awesome JavaScript framework which, if you are not using, you should be! And of course we loved seeing the ASP.NET MVC team pick it up as well, so it is now actually “out of the box” as such :)

We chose to use LightSpeed (of course) for our domain model, as this offered us a lot of advantages in getting up and running quickly (we generally modelled on paper, set up the entities in the database and then code-gen’ed our entities through the Visual Studio designer). All the behavioural aspects of the model were then added through partial classes. Another big advantage we got from using LightSpeed entities was the model validation, which we used to manage a lot of the validation burden for form unbinding, since most of it targeted new or existing entities in the model.

We used WCF to handle the distributed aspects of the solution (mainly dealing with payment providers and the like). As most “real” solutions these days are likely to have some kind of distributed resources to work with, I think this would generally be considered “out of the box” :)

So what did we learn?

1. By itself, the framework is not going to cut it for most real world apps. By this I mean there are plenty of refinements you are likely to make when building out an application on MVC, to the point that you ultimately develop your own “wrapper framework” based on MVC. Examples of this for us were handling routing, adding extensions for link generation, adding custom action results, and adding custom filters for concerns such as security – the sort of thing sketched below.
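To give a flavour of the kind of extension I mean, here is a simplified, hypothetical sketch of a custom action filter – this is not the Givealittle code, just an illustration of the extension point:

using System.Web.Mvc;

// Illustrative only – a simple filter of the kind you end up writing in a
// "wrapper framework": here, redirecting anonymous users to the login page.
public class RequiresLoginAttribute : ActionFilterAttribute
{
  public override void OnActionExecuting(ActionExecutingContext filterContext)
  {
    if (!filterContext.HttpContext.Request.IsAuthenticated)
    {
      filterContext.Result = new RedirectResult("/login");
    }
  }
}

Decorating a controller or action with [RequiresLogin] then keeps the security concern out of the action bodies themselves.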

2. The model binding capabilities are great. Learn how to extend these to manage your incoming data more effectively.
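As a sketch of what that can look like (the Project entity and the lookup helper below are placeholders, not our actual binder), a custom IModelBinder can resolve an incoming id into a fully loaded object before the action runs:

using System;
using System.Web.Mvc;

// Placeholder entity type for the sake of the sketch.
public class Project
{
  public int Id { get; set; }
}

// Illustrative custom binder which resolves an "id" route value into an entity,
// so actions can take the entity itself as a parameter.
public class ProjectModelBinder : IModelBinder
{
  public object BindModel(ControllerContext controllerContext, ModelBindingContext bindingContext)
  {
    object rawId = controllerContext.RouteData.Values["id"];
    if (rawId == null) return null;

    int id = Convert.ToInt32(rawId);
    return FindProjectById(id);
  }

  private Project FindProjectById(int id)
  {
    // Placeholder – load the entity through your repository or unit of work here.
    return new Project { Id = id };
  }
}

It would then be registered in Global.asax with ModelBinders.Binders[typeof(Project)] = new ProjectModelBinder(); so any action taking a Project parameter receives the resolved entity.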

3. Routing can quickly become a PITA when your URL structure is not {controller}/{action}/* – most of the samples and guidance out there prescribe an overly generic site structure, which is nice if you can live with it, but generally there are plenty of edge cases to deal with. In Givealittle we had a custom structure, with many concerns duplicated across multiple domains in the site – e.g. adding a blog post to both Organisations and Projects – and clearly we didn’t want to duplicate the functionality in both of those controllers just to save a few additional routes. The routing API is great, and a welcome addition to the framework, but it can be quite unwieldy when adding routes manually with constraint collections, defaults collections and the like. We simplified this for ourselves by using a helper which builds up a route on our behalf in this style:

RouteBuilder

.Get() indicates it is a GET request for a resource such as /login, which would then map to the Home controller’s Login action.
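For the curious, here is a rough, hypothetical sketch of what a helper in that style might boil down to underneath – not the actual RouteBuilder, just an illustration of hiding MapRoute and an HTTP method constraint behind a small fluent front:

using System.Web.Mvc;
using System.Web.Routing;

// Hypothetical fluent helper in the style described above.
public class RouteBuilder
{
  private readonly RouteCollection _routes;
  private string _url;
  private string _httpMethod;

  public RouteBuilder(RouteCollection routes)
  {
    _routes = routes;
  }

  public RouteBuilder Get(string url)
  {
    _url = url;
    _httpMethod = "GET";
    return this;
  }

  public void To(string controller, string action)
  {
    // Wrap the route name, defaults and constraint plumbing up in one place.
    _routes.MapRoute(
      _httpMethod + " " + _url,
      _url,
      new { controller = controller, action = action },
      new { httpMethod = new HttpMethodConstraint(_httpMethod) });
  }
}

With a helper like that, new RouteBuilder(routes).Get("login").To("Home", "Login"); reads close to the intent of the route rather than the mechanics of registering it.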

So how did it compare?

Having now built solutions using WebForms, Monorail and even Rails, there are a number of clear benefits I think can be gained by using MVC.

These would be:

  • Well suited for internet facing solutions where tighter control over the HTML/CSS matters
  • Improved quality from higher test coverage – the controllers are easily testable
  • Paired with the Routing framework you get great control over how the site structure is mapped
  • Simpler – there is a lot less going on in ASP.NET MVC compared to, say, WebForms, so it was much easier to be productive and later on much easier to refactor. Compared to Monorail, the lack of a lot of “out of the box” infrastructure was also useful, as there was less to contend with up front. I think this makes it more approachable for developers.

Of course you are closer to the “bare metal” so you do have to have a better understanding of the web. Keep that in mind.

If none of those benefits apply or are high priorities, then you might find you are best off sticking with something else, but I reckon MVC is the way forward for building web applications on the .NET framework :)



Introduction to Oslo

Tagged as Presentations

Thanks to everyone who came along to the session on Oslo last Thursday at the Auckland Connected Systems User Group – it was great to be there for the first time :)

As indicated during the session, I’ve uploaded the slide deck and the M file we used during the quick look at the M language for you to have a play with – remember to download and install the Oslo Jan SDK to make use of these.

The Oslo Dev Center is the place you want to go for all things Oslo or M related, and also check out what some of the bloggers out there are doing with M so far..



Architects Forum – Cloud Computing

Tagged as Presentations

A couple of weeks ago I gave some presentations at the most recent Microsoft Architects Forums in WLG, AKL and CHC. The talk was on the emerging offerings in the cloud computing space, and from all accounts this is a topic which is hot on people’s minds, given the level of banter we ended up having at the end of the sessions :)

For those who came along and wanted to get a copy of the slides, I now have these available for download – grab them here.

And for those who didn’t make it along.. Mark Carroll is going to be running some more forums in the New Year – so get in contact with him if you are interested in coming.

Or..

.. if you are a developer in the North Island you are probably already taking part in the Unplugged events touring round the country this week – if not, check it out! :)



SQL 2008 Training Followup

Tagged as Presentations

Last week I was in Wellington, Auckland and Christchurch running some day-long sessions covering what is new across SQL Server 2008 and drilling down into some of the new features.

The sessions were a lot of fun – a big thank you to everyone who attended. The feedback you have given so far has been awesome, and both Darryl and I are really pleased that it hit the spot and that there was a great community vibe in each location – hope you are all planning to attend the upcoming PASS Community event or a user group near you soon!

I have a few posts to put up following on from those events, but the first is just to make the slides available for you all to grab. I have split the content up into the 5 sessions we ran on the day and have zipped them for you.

Grab them here:


