Sunday, 20 June 2010

Technical Reveal All: Automate Database Build With Team Foundation Build (part 1)

My previous post was mostly a rant about the vagaries of working with VS 2010 (Ultimate) - TFS and Team Foundation Build specifically.  Just to recap: I was struggling to get my automated build with Team Build to incorporate the necessary database build and deploy (including the test data insert).  I could manually build and deploy the database, and insert test data by running the Test Generator to kick off the pre-defined Integration Test Plan.  I could even tie all this up locally to automate these steps when running a test build (via the TestSettings file).  However, the difficulties come when trying to achieve the same thing on the build server.

As I confessed during my complaints-based post, I suspected it was as much about my lack of experience as it was about the capabilities of the Studio build tools.  I readily admit that this may well still be the case and that my solution isn't the most elegant. Well, my build works just fine no matter how inelegant it is, and I won't have the time or inclination to improve it until it either breaks or I attack this whole Application Lifecycle Management (ALM) issue on another project.

However, what alarmed me more than anything was the lack of readily available information on this particular topic when I googled.  Therefore, this post is designed to offer my experience and my way of achieving what I consider a basic and essential automated build feature in .NET.

Getting Started: Database Projects (Part 1)
This is the first time I have used Studio database projects. The majority of the projects I work on are object-centric (that is, the database is simply for persisting data and nothing more - no complex stored procedures and certainly no database triggers to be seen on any of my projects if I can help it).  As a result, up until now my database code has been deliberately sparse: a database creation script, table creation scripts, and test data scripts for unit/integration tests of the Data Mapper, as well as a batch file for effectively bootstrapping the database build as part of the automated build. I'm not a database guy, so the simpler and sparser it is, the less there is to go wrong and confuse me.  My Dad always used this line when justifying his car purchases: the fewer working parts or gadgets, the less there is to fix.  (The old man had a point; I just drew the line at the Seat Marbella he had for a while.)
However, I thought I'd give the Studio database project a go to get into the spirit of the whole MS toolset.  I have to admit that what I've found so far has been pleasing and easy to deal with. I can extract the scripts directly from a working database to include as part of my deployment to test or live.  What I actually end up with doesn't amount to much more than what I get with my hand-written scripts, but it's easier getting there and I don't have to debug them to ensure that they work - they just do (I include test data generation in this too).

So, first of all create a new database project as part of the solution.  Secondly, go to the Schema Explorer and create your tables much in the same way you would in SQL Server using the Management Studio user interface.  Then, build and deploy your database from the Build menu in Studio.  What this does for the uninitiated like me is the following:
  1. The build creates your database scripts for you based on what you have produced in the Schema Explorer
  2. The deploy runs the database scripts just created in the build step
  3. When you build the solution, the database build runs like any other project, and when you run your solution, both the build and the deploy are executed (not as build pre-requisite tasks but as an integral part of the solution)
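Since the whole point here is automation, it's worth noting that the same build and deploy can be driven from the command line, because a .dbproj is just another MSBuild project. The sketch below is a hedged illustration rather than my actual build script - the MSBuild path and project name are placeholders you'd swap for your own:

```python
import subprocess

# Placeholder paths - adjust for your own machine and solution layout.
MSBUILD = r"C:\Windows\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe"
DBPROJ = r"MyDatabase\MyDatabase.dbproj"

def build_and_deploy(configuration="Debug", dry_run=False):
    """Build the database project, then deploy it - mirroring the
    Build and Deploy items on the Build menu in Studio."""
    cmd = [
        MSBUILD,
        DBPROJ,
        "/t:Build;Deploy",                     # run both targets in one go
        f"/p:Configuration={configuration}",
    ]
    if not dry_run:
        subprocess.check_call(cmd)
    return cmd

# Show the command that would be issued without actually running MSBuild.
print(build_and_deploy(dry_run=True))
```

Running it for real obviously needs MSBuild and the database project on disk; the dry run just shows the command line that would be issued.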
OK, so this is all good, but now you need to harness the true power of database projects to make their adoption worthwhile - data generation plans.  When you look at the database project within the Solution Explorer in Studio you will see a Data Generation Plans folder (which will contain *.dgen files).  You can have multiple data generation plans (i.e. different configurations) for different purposes (unit testing, integration testing, system testing, etc.).  The Data Generation Plan window is split into 3 panes:
  • Table View
  • Column View
  • Data Generation Preview
To build your data generation plan you essentially select each table from your schema in turn via the Table View and then configure the data to be automatically generated for each column of the table using the Column View.  You can manipulate individual columns by editing the column properties.  This is very flexible: the data generation properties of string columns allow regular expressions to generate meaningful data, even selection from a number of alternatives (achieved by use of pipes, e.g. "string1 | string2 | string3"), while the data generation properties of integer columns allow upper and lower bounds for numeric values.
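To make the pipe-separated alternation and the integer bounds concrete, here's a small Python sketch of the idea. This is purely an illustration of the concept - it isn't the VS Data Generator, and the column values are made up:

```python
import random

def pick_alternative(pattern, rng=random):
    """Mimic the string generator's pipe alternation:
    'string1 | string2 | string3' yields one of the three values."""
    return rng.choice([alt.strip() for alt in pattern.split("|")])

def bounded_int(lower, upper, rng=random):
    """Mimic the integer generator's lower/upper bounds (inclusive)."""
    return rng.randint(lower, upper)

# Hypothetical Status and Quantity columns for a test row
status = pick_alternative("Pending | Active | Closed")
quantity = bounded_int(1, 100)
print(status, quantity)
```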
You can then preview the data that will be generated using (you guessed it) the Data Generation Preview pane, and then generate the data (insert it into the database tables) via the Data | Data Generator | Generate Data menu.  After selecting your database connection of choice, your database is now populated with the test data. What's more, the plan is saved to a .dgen file for re-use.
Automating this is reasonably simple once you know how, but it's still not as intuitive as I would like.

OK, so automated data generation using Data Generation Plans is useful. The data doesn't necessarily have to be for testing purposes, I guess, but more than likely it will be.  However, so as not to tie data generation to test data only, a data generation plan (part of a database project) can be hooked into a test project via a context-sensitive menu for test projects: Test | Test Database Configuration.  From here you can configure the database scripts to deploy the database every time a test build executes, as well as configuring a data generation plan to run to insert test data after the database has been deployed.
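For what it's worth, that dialog just persists its settings into the test project's app.config. From memory it ends up roughly like the fragment below - the element and attribute names are as I recall them from the 2010 tooling, and the paths and connection string are placeholders - so treat this as a sketch rather than gospel:

```xml
<!-- Sketch of a test project's app.config after using the
     Test Database Configuration dialog; paths are placeholders. -->
<configuration>
  <configSections>
    <section name="DatabaseUnitTesting"
             type="Microsoft.Data.Schema.UnitTesting.Configuration.DatabaseUnitTestingSection, Microsoft.Data.Schema.UnitTesting" />
  </configSections>
  <DatabaseUnitTesting>
    <!-- Deploy the database project before each test run -->
    <DatabaseDeployment DatabaseProjectFileName="..\..\MyDatabase\MyDatabase.dbproj"
                        Configuration="Debug" />
    <!-- Run the data generation plan after deployment -->
    <DataGeneration DataGenerationFileName="..\..\MyDatabase\Data Generation Plans\IntegrationTests.dgen"
                    ClearDatabase="true" />
    <ExecutionContext Provider="System.Data.SqlClient"
                      ConnectionString="Data Source=.;Initial Catalog=MyDatabase;Integrated Security=True" />
  </DatabaseUnitTesting>
</configuration>
```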

And there we have it: a way of getting the database build and deploy (including test data) into the full automated build.  Well, on the client at least... Part 2 looks at automating this on the server.

It should be noted that this is not something exclusive to Studio 2010. As far as I'm aware, database projects and Data Generation Plans have been around since Visual Studio 2008, albeit in the Database Edition.  Differentiating between object code and database code in separate developer editions was not a smart move by Microsoft in 2008.  It may have meant additional licence sales, but it didn't match the make-up of an average commercial .NET development team - application developers need to write database scripts sometimes too!

