My quest started with an innocent thought: "what if I used configuration file transformations instead of managing files via xcopy?" I already had some experience with this .NET feature from deploying my open source project to AppHarbor, and I didn't expect any difficulties: the technology is rather straightforward and effective.
First I noticed that some web.config sections were extracted into separate files, so I tried implementing all the transforms in web.release.config (with no luck, of course): each separate file requires its own transformation. That's how I met SlowCheetah. I read a great introduction to the tool and realized that simply installing the VS extension is not enough - a way of propagating the MSBuild tasks to the build server is also required. Fortunately, here I found a detailed explanation.
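To give an idea of what these transforms look like, here is a minimal Web.Release.config sketch; the connection string name and value are made up for illustration (with SlowCheetah, the same XDT syntax applies to the transform files for the extracted sections):

```xml
<?xml version="1.0"?>
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <connectionStrings>
    <!-- Overwrite the attributes of the entry whose "name" matches -->
    <add name="DefaultConnection"
         connectionString="Data Source=prod-sql;Initial Catalog=MyDb;Integrated Security=True"
         xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
  </connectionStrings>
</configuration>
```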
At this point I had a few simple connection string transformations and all the required changes in project files and NuGet packages. I triggered a build and examined the contents of the _PublishedWebsites folder in my build drop location: the transformations hadn't been applied. After some searching I realized that this is exactly how it is supposed to work - transformations are applied only when the site is published. I added an extra argument, /p:DeployOnBuild=True, to the MSBuild call in my build definition and got packaged web sites in the _PublishedWebsites folder. All the transformations were in place this time. And that's where the real trouble started.
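For reference, the resulting MSBuild invocation looked roughly like this (the solution name and configuration are placeholders):

```
msbuild MySite.sln /p:Configuration=Release /p:DeployOnBuild=True
```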
The output of MSBuild with /p:DeployOnBuild=True is basically a zip archive with a rather tricky hierarchy. The desired published site lay deep inside the archive, and some of its folders were build-version specific (dynamic). I realized that working with the package using common file tools was not how it was meant to be consumed.
The first and most obvious solution was using the generated sitename.deploy.cmd file to deploy the package. The script uses MSDeploy internally and requires the tool to be installed and configured on all target environments. By that time I had all my builds set up and running with a PowerShell xcopy-style deployment strategy (which is, generally speaking, wrong), so I decided it was too much work to redesign everything just because of config transformations, and continued searching.
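For completeness, the generated script is normally run like this (the /T flag does a trial run, /Y performs the actual deployment; the server address and credentials here are placeholders):

```
mysite.deploy.cmd /T
mysite.deploy.cmd /Y /M:https://myserver:8172/msdeploy.axd /U:deployUser /P:secret
```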
What if I could extract the contents of the package with msdeploy and put it into shared location? I wrote this:
"c:\Program Files\IIS\Microsoft Web Deploy V3\msdeploy.exe" -verb:sync -source:package=c:\Share\mysite.zip -dest:contentPath=\\my-pc\Share\Test -disableLink:AppPoolExtension -disableLink:ContentExtension -disableLink:CertificateExtension
and got the error
Error: Source (sitemanifest) and destination (iisApp) are not compatible for the given operation.
I didn't find a solution to this problem. Here are some links that might help: one, two. Please note the suggestion from the latter:
This occurs because the iisApp provider specified in the destination argument is not expecting a Manifest.xml file in the source. To resolve this issue, use the auto provider instead
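Applied to my case, the suggested call would look like this (same paths as in my earlier attempt, only the destination provider changed):

```
"c:\Program Files\IIS\Microsoft Web Deploy V3\msdeploy.exe" -verb:sync -source:package=c:\Share\mysite.zip -dest:auto
```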
But "auto" was actually the same as my first attempt, so no luck here.
The third option was triggering SlowCheetah during the build, replacing the configuration files in _PublishedWebsites explicitly. The approach is described here in more detail. Unfortunately, I noticed that all the configuration files were locked during the build. I found one possible workaround here but wasn't excited about it at all.
At this point I decided to work with the mysite.zip package using PowerShell. I already had some PowerShell deployment steps by that time, so I figured adding another one wouldn't be too much overhead.
Here I'm going to show you two auxiliary PowerShell functions I used to achieve the goal. They are rather simple and are assembled from various pieces found on the Internet here and there.
```powershell
# Copy published site from deployment package to destination folder
function Copy-PublishedSite {
    param($zipFileName, $destination)

    if (!(Test-Path $zipFileName)) {
        Throw "Deployment package is missing"
    }
    # The Shell.Application COM object treats a zip archive as a regular folder
    $shell = new-object -com shell.application
    Get-ZipChildFolders $shell.Namespace($zipFileName).Items()
}
```
The function above performs an input check (the destination check is omitted) and calls the recursive search function:
```powershell
# Search for published site inside a deployment package
function Get-ZipChildFolders {
    param([object]$items)

    $containerName = "PackageTmp"
    foreach ($item in $items) {
        if (($item.IsFolder -eq $true) -and ($item.Name -eq $containerName)) {
            # $shell and $destination are picked up from the caller's scope;
            # 0x14 = overwrite existing files (0x10) + no progress dialog (0x4)
            $shell.NameSpace($destination).CopyHere(($item.GetFolder.Items()), 0x14)
            return
        }
        elseif ($item.GetFolder -ne $null) {
            Get-ZipChildFolders $item.GetFolder.Items()
        }
    }
}
```
This function traverses the archive hierarchy looking for the "PackageTmp" folder, which is assumed to be the container for the published site. If the folder is found, the function copies its contents to the destination folder.
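A call would then look something like this (the paths are examples; note that Get-ZipChildFolders relies on $shell and $destination being visible from the caller's scope):

```powershell
Copy-PublishedSite -zipFileName "C:\Drops\Build_1.0.0.1\mysite.zip" -destination "\\my-pc\Share\Test"
```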
These functions worked fine in a command prompt window, but they failed during a TFS build: CopyHere wasn't copying anything and didn't throw any errors. I didn't manage to make it work, so instead I decided to use the command-line version of 7-Zip. Here is the code I got:
```powershell
# Copy published site from deployment package to destination folder
function Copy-PublishedSite {
    param($zipFile, $currentSite)

    Print-LogMessage "Copying zip package..."
    Copy-Item $zipFile.FullName $deploymentFolder

    Print-LogMessage "Unzipping package..."
    $tempFolder = Join-Path $deploymentFolder "tmp"
    $tempZip = Join-Path $deploymentFolder $zipFile.Name
    & $zipUtilityPath x $tempZip ("-o" + $tempFolder) -aoa -r

    Print-LogMessage "Creating target directory..."
    $targetPath = Join-Path (Join-Path $deploymentFolder "MySitesFolder") $currentSite
    Create-DirectoryStructure $targetPath

    Print-LogMessage "Moving package contents to target directory..."
    $moveFolder = Get-ChildItem $tempFolder -Filter "PackageTmp" -Recurse
    Move-Item (Join-Path $moveFolder.FullName "*") $targetPath -Force

    Print-LogMessage "Deleting temp data..."
    Remove-Item $tempFolder -Force -Recurse
    Remove-Item $tempZip -Force
}
```
It's not a complete solution, but the main idea is clear. First I copied the entire package to the deployment folder (transferring an archive as a single file over the network is faster). Then I unzipped the contents of the package to the temporary folder "tmp" (the folder-already-exists check is omitted). Then I moved the contents of the "PackageTmp" subfolder into my sites directory. Finally, I did some cleanup.
With the approach above I keep my existing PowerShell deployment strategy and have all the config transformation features I need. I realize this solution is far from ideal and someday I'll have to move to MSDeploy, but right now I don't see a strong reason to do that.