In my last post I described adding some unit tests to one of our core libraries (one used by three projects currently in development) and finding a bug in the code. Well, of course I fixed the bug, but now what?
Now each of the teams working on the other projects will have to download the newly compiled core library DLLs, replace the current DLLs in their project, and rebuild. And I will have to do the same for all of the projects on the build server... just kidding - that's the "old way".
So how does the "new way" work? What actions are required by the "new way"?
I'll answer those two questions in reverse order.
The answer to question #2 is: none. That's right, no action is required of each developer using that core library. What's that you say? No, I am not crazy. Actually, that was a little white lie - I am a little crazy - and it does require the developer to double-click a batch file to update to the new core libraries on their dev machine. And as far as the build server goes, I really have to do nothing to integrate the library's new DLLs into each of the projects that use it.
How it works:
On the build server: I have that core library set up, like the rest of our projects, under a continuous build process (using CruiseControl.NET, NAnt, MSBuild, NUnit, NCover, FxCop, NDepend, etc.). For the core library project specifically, after a successful build the resulting DLLs are committed to another Subversion project we call "Lib". Lib is a repository containing the compiled assemblies we commonly use: the Enterprise Library, other third-party assemblies, and our core library. Each one is set up in its own folder, with subfolders underneath for the different versions along with a "current" version.
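To give a rough idea of that post-build publish step in NAnt: the target name, property names (`lib.dir`, `build.output.dir`, `svn.exe`), and folder layout below are my own illustration, not our actual build file, and I'm assuming a command-line svn.exe is available.

```xml
<!-- Hypothetical NAnt target: publish freshly built DLLs to the Lib repository.
     All property names here are illustrative, not from the real build file. -->
<target name="publish-to-lib" depends="build, test">
  <!-- Copy the new assemblies into the "current" folder of the Lib working copy -->
  <copy todir="${lib.dir}\CoreLibrary\current" overwrite="true">
    <fileset basedir="${build.output.dir}">
      <include name="CoreLibrary*.dll" />
    </fileset>
  </copy>
  <!-- Commit them so consuming projects pick them up on their next SVN update -->
  <exec program="${svn.exe}">
    <arg value="commit" />
    <arg value="${lib.dir}" />
    <arg line='-m "Automated commit of CoreLibrary build output"' />
  </exec>
</target>
```

The key point is that this target only runs after the build and unit tests succeed, so nothing broken ever lands in Lib.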
If a project uses one of the assemblies or the core library, the build file for that project says so and specifies which version it uses. One of the common build tasks for all projects is to run an SVN update on the Lib repository, pulling any updates (including the new core library updates I just made) down to a local Lib folder shared by all projects on the build server. Then any assemblies used by the project are copied into the project's bin folder, replacing any older versions. After that the build process for the project proceeds as usual - in our case, using the new core library updates. Of course, the project's compilation and unit tests will uncover any issues it might have with changes to any of its dependencies.
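The matching dependency-update step on the consuming project's side might look something like this in NAnt (again, the property and target names here are hypothetical sketches of the idea, not our real build file):

```xml
<!-- Hypothetical "update dependencies" target for a consuming project.
     The version the project uses is declared as a property in its build file. -->
<property name="corelib.version" value="current" />

<target name="update-lib">
  <!-- Pull any newly committed assemblies down to the shared local Lib folder -->
  <exec program="${svn.exe}">
    <arg value="update" />
    <arg value="${lib.dir}" />
  </exec>
  <!-- Replace the project's copies with the version this project declares -->
  <copy todir="${project.dir}\bin" overwrite="true">
    <fileset basedir="${lib.dir}\CoreLibrary\${corelib.version}">
      <include name="*.dll" />
    </fileset>
  </copy>
</target>
```

Because the version is just a property, a project that isn't ready for the latest changes can pin itself to a specific version folder instead of "current".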
On our developers' machines: Each project's build file, along with the batch files that automate it, is included in the source code repository, so when a developer checks out the code for a project he/she gets the build files and batch files too. One of the batch files calls a build script that pulls down the Lib repository and copies the needed assemblies into the project's bin folder (same as it works on the build server). All file paths used by the build script are passed in from the batch file via environment variables, so the paths can be different on each developer's machine. There are three file paths stored as environment variables: the path to NAnt, the path to TortoiseSVN, and the path to the developer's local Lib folder. The developer only needs to set up the three environment variables once, because they are the same three used by every project's batch file and should not change. So, all the developer needs to do is run the batch files to update all dependent libraries.
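Here's roughly what such a double-clickable batch file could look like. This is a sketch with made-up names: NANT_PATH, SVN_PATH, and LIB_PATH stand in for whatever the three environment variables are actually called, and update-lib for the build target being invoked.

```bat
@echo off
rem Hypothetical update-lib.bat: the file the developer double-clicks.
rem It assumes three environment variables are set once per machine:
rem   NANT_PATH - folder containing nant.exe
rem   SVN_PATH  - folder containing the command-line svn.exe
rem   LIB_PATH  - the developer's local Lib working copy

if "%NANT_PATH%"=="" (echo NANT_PATH is not set & exit /b 1)
if "%SVN_PATH%"=="" (echo SVN_PATH is not set & exit /b 1)
if "%LIB_PATH%"=="" (echo LIB_PATH is not set & exit /b 1)

rem Hand the machine-specific paths to the shared build script as properties
"%NANT_PATH%\nant.exe" -buildfile:project.build update-lib ^
    -D:svn.exe="%SVN_PATH%\svn.exe" ^
    -D:lib.dir="%LIB_PATH%"
pause
```

The pause at the end just keeps the console window open so the developer can see whether the update succeeded before the window closes.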
While I would like to, I cannot claim any of these ideas as original - they were all adapted from Marc Holmes's great book, Expert .NET Delivery Using NAnt and CruiseControl.NET.
Now I just need to get an email sent out to everyone whenever the Lib repository is updated so we will know when to run the batch file - but that's a task for another day.