Feb 9, 2014

Acceptance Stage in CI Jenkins and Psake

Related posts:
The “ACCEPTANCE STAGE” is the third job in my delivery pipeline.
Please refer to the general article about setting up the delivery pipeline with Jenkins and Psake here.
You can take a look at the articles dedicated to the preceding jobs in my pipeline: the build stage here and the commit stage here.
The primary job of the acceptance stage is to execute the acceptance tests, which in my case are based on Selenium. Its work can be summarized in 2 major activities:
  • Deploying the artifacts that are the output of the build stage instance and that were tested by the commit stage instance.
  • Executing the Selenium tests with NUnit.
1. Deploy artifacts
The deployment task itself consists of 2 sub-tasks:
  • Deploying the web application to a dedicated front-end server that is not the build server.
  • Deploying the database with SSDT to a dedicated database server that is not the build server.
To get the REVISION context from the upstream job in the pipeline instance (the commit stage), mark the job as parameterized and define a parameter with the same name that is specified in the build trigger of the "COMMIT STAGE" job.
AcceptanceStageDefinition

The “build” step is quite simple in this case. It contains only “build.cmd”, which was reviewed in the previous articles.
The functionality for deploying the web application and the database is implemented in “acceptancestage.ps1”.

If you don’t use the “msbuild” command line for deploying your web packages to a dedicated machine (that is accessible in your network), you have to implement this with custom logic.
In my case I did it with the PowerShell WebAdministration module.

So, the cmdlets from the module that create the web site and applications have to be executed in the context of the machine they will reside on, and that machine is different from the build machine that hosts Jenkins and triggers the instance of the “ACCEPTANCE STAGE” job.
One possible option is to build a web service responsible for deployment and web site creation, which the build machine can invoke. Unfortunately, this implies more development effort plus integration and authentication concerns, which complicate the delivery pipeline.

The other approach (the one I chose) is to use PowerShell remoting, which allows me to call the cmdlets from the WebAdministration module in the context of the web server where I want to deploy my web package.
There is a prerequisite for this: the front-end server (webhost in the script) should expose a shared folder into which the artifacts are copied before DeployWebProject.ps1 is executed with remoting.
The script uses the IIS WebAdministration module and loops through all published web sites on the front-end server in order to calculate the port for the new site that is about to be published.
Then the New-Website and New-WebApplication cmdlets are used. After the web application is deployed, its web.config is modified so the connection string is set up properly.
DeployWebProject.ps1 is located on the Jenkins server, but if it refers to other scripts or resources, they should be placed on the “remote” front-end server.
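PowerShell remoting must be enabled on the front-end server before Invoke-Command can reach it (the TechNet link at the end of this article covers the details). A minimal sketch of the one-time setup, assuming default WinRM settings and a non-domain (workgroup) scenario:

# Run once on the front-end server, from an elevated PowerShell prompt.
Enable-PSRemoting -Force

# Run once on the Jenkins/build machine if the servers are not in the same domain,
# so it is allowed to connect to the front-end host (the IP is a placeholder).
Set-Item WSMan:\localhost\Client\TrustedHosts -Value "webhostIP" -Force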

task DeployApplication { 

    $webhost = "\\webhostIP" 
    $webhostPassword = "webhostPassword"
    $dbhost = "\\dbhostIP"
    $dbhostPassword =  "dbHostPassword"
    Write-Host -ForegroundColor "Green" -BackgroundColor "White" "Start DeployApplication task..."
    If ($revision)
    {
        Write-Host -ForegroundColor "Green" -BackgroundColor "White" "Start deploying application..."
        $p = Resolve-Path .\       
        #Authenticate as User1 with needed privileges.
        $password = convertto-securestring $webhostPassword -asplaintext -force
        $credentials = new-object -typename System.Management.Automation.PSCredential -argumentlist "webhost\User1",$password
        #copy artifacts
        Write-Host "Moving artifacts packages on the front-end server hard drive..."
        $artifacts_directory = Resolve-Path .\Artifacts\$revision\Source\Packages

        NET USE "$webhost\ci\$revision" /u:webhost\User1 $webhostPassword 
        robocopy $artifacts_directory  "$webhost\ci\$revision" WebProject.zip
        net use "$webhost\ci\$revision" /delete
        Write-Host "Moving artifacts done..."

        #copy deploy scripts
        $deployScriptsPath =  Resolve-Path .\"DeployScripts"    
        NET USE "$webhost\CI\powershell" /u:webhost\User1 $webhostPassword 
        robocopy $deployScriptsPath "$webhost\CI\powershell" PublishWebSite.ps1
        net use "$webhost\CI\powershell" /delete      
        $dbServer = "dbServerConnectionStringName"  
        $dbServerName = "dbServerName"
        $sqlAccount = "sqlAccount"
        $sqlAccountPassword = "sqlAccountPassword"  

        # The second $revision argument is passed as the database name - each deployment gets a database named after the revision.
        invoke-command -computername webhostIP -filepath "$p\DeployScripts\DeployWebProject.ps1" -credential $credentials -argumentlist @($revision, $dbServer, $revision, $sqlAccount, $sqlAccountPassword)

        Write-Host -ForegroundColor "Green" -BackgroundColor "White" "Start deploying database..."
        #...pretty much the same
    }
    else
    {
        Write-Host "Revision parameter for the Acceptance Stage job is empty. No artifacts will be extracted. Job will be terminated..."

        throw "Acceptance Stage job is terminated because no valid value for revision parameter has been passed."
    }
}

The link below explains how the credentials can be stored encrypted in a file rather than being used as plain text in the ps1 script.

http://blogs.technet.com/b/robcost/archive/2008/05/01/powershell-tip-storing-and-using-password-credentials.aspx
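A minimal sketch of that approach, assuming a hypothetical file C:\CI\webhost.cred and the same User1 account used above. Note that the file must be created by the same Windows account that runs the Jenkins service, because the DPAPI encryption is per user:

# One-time step: store the password encrypted (DPAPI, current user) in a file.
Read-Host "Enter the webhost password" -AsSecureString |
    ConvertFrom-SecureString |
    Out-File "C:\CI\webhost.cred"

# In the psake task: read the password back instead of keeping it as plain text in the script.
$password    = Get-Content "C:\CI\webhost.cred" | ConvertTo-SecureString
$credentials = New-Object System.Management.Automation.PSCredential("webhost\User1", $password)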

Below is the beginning of the DeployWebProject.ps1 script:
param(

 [string]$revision = $(throw "revision is required"),
 [string]$dbServer = $(throw "db server is required"),
 [string]$dbName = $(throw "db name is required"),
 [string]$sqlAccount = $(throw "sql account is required"),
 [string]$sqlAccountPassword = $(throw "sql account password is required")
 )
    $p = Resolve-Path .\
    Write-Host $p    
    Set-ExecutionPolicy RemoteSigned -Force
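The rest of DeployWebProject.ps1 is not reproduced here. Below is a hypothetical sketch of the site-creation part described above; the port calculation, site name, paths and connection string are illustrative assumptions, not the actual code:

Import-Module WebAdministration

# Find a free port: take the highest port bound to an existing site and add one (assumed logic).
$usedPorts = Get-Website |
    ForEach-Object { $_.Bindings.Collection } |
    ForEach-Object { [int](($_.BindingInformation -split ':')[1]) }
$newPort = ($usedPorts | Measure-Object -Maximum).Maximum + 1

# Create the site and the application for this revision (names and paths are placeholders;
# extracting WebProject.zip to $sitePath is omitted here).
$sitePath = "C:\CI\$revision\WebProject"
New-Website -Name "WebProject_$revision" -Port $newPort -PhysicalPath $sitePath
New-WebApplication -Name "App" -Site "WebProject_$revision" -PhysicalPath $sitePath

# Point the deployed web.config at the revision database ($dbServer holds the connection string name).
$webConfig = Join-Path $sitePath "web.config"
$xml = [xml](Get-Content $webConfig)
$cs  = $xml.configuration.connectionStrings.add | Where-Object { $_.name -eq $dbServer }
$cs.connectionString = "Data Source=dbServerName;Initial Catalog=$dbName;User ID=$sqlAccount;Password=$sqlAccountPassword"
$xml.Save($webConfig)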


Robocopy with impersonation is used because the context of the Jenkins job does not have permission over the shared folder by default. The context is the user that runs the Windows service.

Robocopy copies the artifacts from C:\CI\Artifacts\$revision to $webhost\ci\$revision (this folder might be created dynamically by the deploy ps1 script).

After the web deployment is completed, the shared folder content is cleared.

Deploying the database uses pretty much the same approach. The artifacts needed here are:

  • Dacpac file
  • Publish database profile file
  • Init.sql scripts that can be used for creating the initial data needed for the web application to be operational.
The *.dacpac file is the output of your SSDT project and should be archived as an artifact in the “Build Stage” after the database project is built. It is available in the bin\Debug (or bin\Release) folder of the SSDT project.

In order to generate database publish profile, open up your ProjectDB.sln and right click on the SSDT project (named ProjectDB) -> Publish.


CreatePublishingDBProfileXml


Then click “Save Profile As…” and save the file as ProjectDB.publish.xml. The file is stored on the Jenkins file system.

Below is sample content of the file:
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <IncludeCompositeObjects>True</IncludeCompositeObjects>
    <TargetDatabaseName>4322</TargetDatabaseName>
    <DeployScriptFileName>4322.sql</DeployScriptFileName>
    <TargetConnectionString>Data Source=dbserver;Initial Catalog=xxx;User ID=xxx;Password=xxx;Pooling=False</TargetConnectionString>
    <ProfileVersionNumber>1</ProfileVersionNumber>
  </PropertyGroup>
</Project>

The artifacts are copied to the shared folder $dbhost\CI\$revision on the database server. Then PowerShell remote execution is used again, just as for the web site deployment.

The publish xml is copied to the $dbhost\CI\$revision folder from its location on the Jenkins machine file system, and its content and connection string are adjusted. The newly created database is named after the revision number, so it can be easily recognized when troubleshooting is required. The publishing profile is also renamed to ProjectDB.publish.$revision.xml after it is copied to the appropriate folder.
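A minimal sketch of how that adjustment could be scripted; the file locations, server name and credentials are assumptions used for illustration:

# Copy the template profile next to the other artifacts and rename it for this revision.
$profilePath = "C:\CI\$revision\ProjectDB.publish.$revision.xml"
Copy-Item "C:\CI\DeployScripts\ProjectDB.publish.xml" $profilePath

# Point the profile at a database named after the revision and at the target server.
$xml = [xml](Get-Content $profilePath)
$props = $xml.Project.PropertyGroup
$props.TargetDatabaseName     = "$revision"
$props.DeployScriptFileName   = "$revision.sql"
$props.TargetConnectionString = "Data Source=dbServerName;User ID=$sqlAccount;Password=$sqlAccountPassword;Pooling=False"
$xml.Save($profilePath)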

Below is the deployment script for the database:
param(
 [string]$revision = $(throw "revision is required"),
 [string]$dbServer = $(throw "db server is required"),
 [string]$sqlAccount = $(throw "sql account is required"),
 [string]$sqlAccountPassword = $(throw "sql account password is required")
 )

    $pathToPublishProfile = "C:\CI\{0}\ProjectDB.publish.{0}.xml" -f $revision
    $dacpacPath = "C:\CI\{0}\ProjectDB.dacpac" -f $revision
    $remoteCmd = "& `"C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\SqlPackage.exe`" /Action:Publish  /Profile:`"$pathToPublishProfile`" /sf:`"$dacpacPath`""

    $sqlInit = "sqlcmd -S {0}  -U {1} -P {2} -d {3} -i `"C:\CI\{3}\ProvideInitData.sql`"" -f $dbServer, $sqlAccount, $sqlAccountPassword, $revision

    Invoke-Expression $remoteCmd
    Invoke-Expression $sqlInit
    Write-Host "Creating database finished.."


As you can see, the artifacts are referenced as local paths on the database server, even though DeployDB.ps1 is invoked from the build server.

After a successful deployment, the temporary content in $dbhost\CI\$revision and $webhost\CI\$revision is cleaned up and the folders are deleted, as sketched below.
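A sketch of how that cleanup might look from the build server, following the same NET USE impersonation pattern as in the deploy task (the database host is cleaned up the same way with its own credentials):

# Remove the per-revision temp folder on the front-end server.
NET USE "$webhost\CI" /u:webhost\User1 $webhostPassword
Remove-Item -Path "$webhost\CI\$revision" -Recurse -Force
NET USE "$webhost\CI" /delete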

At this point you should be able to browse the freshly deployed web application, log in and work with it. A dedicated database named after the revision number is created for each web deployment.

2. Execute Selenium tests

In the previous article I showed how unit tests can be executed and integrated into Jenkins with a bat file.

Here is how I did it with PowerShell.

$nunitProjFile = "$p\Artifacts\{0}\Source\Automation\WebProject.SeleniumTests\SeleniumTests.FF.nunit" -f $revision
$outputFile = "$p\Artifacts\{0}\Source\Src\Automation\WebProject.SeleniumTests\console-test.xml" -f $revision
$nunitCmd = "& `"C:\Program Files (x86)\NUnit 2.6.3\bin\nunit-console-x86.exe`" $nunitProjFile /xml:$outputFile"
Write-Host $nunitCmd

Invoke-Expression $nunitCmd
Write-Host "exit code is " $LASTEXITCODE
 
if ($LASTEXITCODE -ne 0)
{
    throw "One of the selenium tests failed. The acceptance stage is compromised and the job ends with error"
}

Before executing the Selenium tests, the related connection strings must be programmatically modified to point to the correct database and to the web address of the application deployed in the previous step (a sketch follows below).
The Selenium server should be started on the Jenkins machine for the tests to execute successfully.
If even one of the tests fails, the job is terminated and completes as failed.
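A hypothetical sketch of that adjustment, assuming the compiled test assembly’s config file holds the site URL and the connection string under placeholder names:

# Point the Selenium tests at this revision's deployment (file name, keys and values are assumptions).
$testConfig = "$p\Artifacts\$revision\Source\Automation\WebProject.SeleniumTests\bin\Release\WebProject.SeleniumTests.dll.config"
$xml = [xml](Get-Content $testConfig)

$site = $xml.configuration.appSettings.add | Where-Object { $_.key -eq "SiteUrl" }
$site.value = "http://webhostIP:8081/"    # URL of the web application deployed in step 1

$db = $xml.configuration.connectionStrings.add | Where-Object { $_.name -eq "Default" }
$db.connectionString = "Data Source=dbServerName;Initial Catalog=$revision;User ID=sqlAccount;Password=sqlAccountPassword"

$xml.Save($testConfig)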

Related links:

https://wiki.jenkins-ci.org/display/JENKINS/Copy+Artifact+Plugin
https://wiki.jenkins-ci.org/display/JENKINS/Email-ext+plugin
http://technet.microsoft.com/en-us/magazine/ff700227.aspx - how to enable PS remoting


Commit Stage in CI Jenkins and Psake

Related posts:
The “COMMIT STAGE” is the second job in my delivery pipeline.
Please refer to the general article about setting up the delivery pipeline with Jenkins and Psake here.
You can take a look at the article dedicated to the preceding job in my pipeline, the “BUILD STAGE”, here.
The primary job of the “Commit stage” is to quickly validate the artifacts that were archived by the preceding job of the pipeline instance. Its work can be summarized in 2 major activities:
  • Getting the artifacts
  • Testing the artifacts
1. Get the artifacts generated from “BUILD STAGE” job
In order to extract artifacts, “Copy Artifact Plugin” must be installed.
copyArtifactPlugin
For the test execution to work, the “COMMIT STAGE” is the first place where the artifacts have to be “extracted” and used.
As the first build action in the job definition, you have to set up the “Copy artifacts from another project” action.

commitstage-buildStep1
The “Target directory” will be created if it doesn’t exist. The path can contain tokens; in my case it is the number of the revision that triggered the pipeline instance. So, within Artifacts, a folder named after the revision will be created and all artifacts will be copied into it from the Jenkins home folder.

Artifact-copied

2. Testing the artifacts
The next actions that have to be executed are:

  • Execute the unit tests with NUnit and fail the job if one of the unit tests fails
  • Generate the NCover report, regardless of whether all unit tests pass
  • Check the code coverage threshold and fail the job if coverage is below a given value
This screenshot shows the build steps that cover the actions above:
commitStage-BuildSteps

  • Ncover.cmd
ECHO %REVISION%

"C:\Program Files (x86)\NCover\NCover.Console.exe" "%ProgramFiles(x86)%\NUnit 2.6.3\bin\nunit-console-x86.exe" "C:\CI\Artifacts\%REVISION%\Source\WebProject.UnitTests\WebProjectTests.nunit" /xml="C:\CI\Artifacts\%REVISION%\Source\WebProject.UnitTests\TestResult.xml" /noshadow  //w "C:\CI\Artifacts\%REVISION%\Source\WebProject.UnitTests" //ias "Project1.dll" //at  "C:\CI\Artifacts\%REVISION%\Source\WebProject.UnitTests\coverage.trend" //onlywithsource //x "C:\CI\Artifacts\%REVISION%\Source\WebProject.UnitTests\coverage.nccov"

"C:\Program Files\NCover\NCover.Reporting.exe" "C:\CI\Artifacts\%REVISION%\Source\WebProject.UnitTests\coverage.nccov" //lt C:\CI\Artifacts\%REVISION%\Source\WebProject.UnitTests\coverage.trend  //or FullCoverageReport:Html:C:\CI\Artifacts\%REVISION%\Source\WebProject.UnitTests\report_output 

The script executes the unit tests and creates a coverage report based on the output.
FullCoverageReport has been generated for Project1.dll.
For details about the command syntax, refer to the NCover web site at http://www.ncover.com/.
Bear in mind NCover is not a license-free tool.
  • Build.cmd
It’s the same bat file that is already discussed in the previous articles.

The “commitstage.ps1” Psake task copies the NCover report from “WebProject.UnitTests\report_output” to C:\CI\NCoverReports, because that folder is used by the NCover plugin.

The report is also archived with 7-Zip in PowerShell and the archive is moved to the Artifacts\$revision folder (a sketch follows below). You can omit this build step or leave the Psake task empty; it won’t affect the job’s behavior.
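A minimal sketch of what that Psake task might contain; the 7-Zip location and folder names are assumptions:

task PublishCoverageReport {
    $reportSource = "C:\CI\Artifacts\$revision\Source\WebProject.UnitTests\report_output"

    # The NCover Jenkins plugin reads the HTML report from this fixed, absolute folder.
    Remove-Item "C:\CI\NCoverReports\*" -Recurse -Force -ErrorAction SilentlyContinue
    Copy-Item "$reportSource\*" "C:\CI\NCoverReports" -Recurse -Force

    # Archive the report and keep it next to the rest of the artifacts for this revision.
    $archive = "C:\CI\Artifacts\$revision\report_output.7z"
    & "C:\Program Files\7-Zip\7z.exe" a $archive "$reportSource\*"
}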
  • Nunit.cmd
set OLDDIR=%CD%

IF not EXIST "C:\CI\NunitResult" (
 cd "c:\CI\"
 md NunitResult
)

cd %OLDDIR%

"%ProgramFiles(x86)%\NUnit 2.6.3\bin\nunit-console.exe" C:\CI\Artifacts\%REVISION%\Source\WebProject.UnitTests\WebProjectTests.nunit /xml:C:\CI\Artifacts\%REVISION%\Source\WebProject.UnitTests\output.xml

if %ERRORLEVEL% NEQ 0 goto fail_build
goto success

: fail_build
copy C:\CI\Artifacts\%REVISION%\Source\WebProject.UnitTests\output.xml C:\CI\NunitResult /y
echo 'there are unit tests that failed.'
exit 1

: success
echo 'all unit tests succeeded'
copy C:\CI\Artifacts\%REVISION%\Source\WebProject.UnitTests\output.xml C:\CI\NunitResult /y
exit 0

The script executes the unit tests and copies the result to C:\CI\NunitResult. This folder path is used by the NUnit plugin.

The problem is that the NUnit plugin for Jenkins doesn’t work with paths relative to the workspace, only with absolute paths.
The NUnit report is available for the job’s instance regardless of whether the tests succeed or fail.
  • Ncover-threshold.cmd

"C:\Program Files\NCover\NCover.Reporting.exe" "C:\CI\Artifacts\%REVISION%\Source\WebProject.UnitTests\coverage.nccov" //lt C:\CI\Artifacts\%REVISION%\Source\WebProject.UnitTests\coverage.trend  //or MethodModule:Xml:C:\CI\Artifacts\%REVISION%\Source\WebProject.UnitTests\%REVISION%.xml  /e //mc BranchCoverage:95:Module

if %ERRORLEVEL% NEQ 0 goto fail_build
goto success
: fail_build
echo 'Tested modules have branch coverage below the given threshold 95%.'
exit %ERRORLEVEL%
: success
echo 'Tested modules are above the given threshold 95%.'
exit 0
A code coverage of 95% has been set up as the threshold. If the coverage is less than this value, NCover.Reporting fails with exit code '3'. The job instance fails, which automatically invalidates the pipeline instance.

The post-build actions leverage the NCover and NUnit command executions to generate reports that are available for each execution of the job.


commitStage-PostBuildActions


The NCover and NUnit plugins generate reports that are available for each of the run instances.


commitStageInstance


Finally, the “Commit Stage” triggers the next job in the pipeline, called “Acceptance Stage”, via the post-build step “Trigger parameterized build on other projects”. The $REVISION context is passed as a parameter.

CommitStage-TriggersAcceptance


Related Links:

https://wiki.jenkins-ci.org/display/JENKINS/Copy+Artifact+Plugin
https://wiki.jenkins-ci.org/display/JENKINS/NUnit+Plugin
http://wiki.hudson-ci.org/display/HUDSON/NCover+Plugin
https://wiki.jenkins-ci.org/display/JENKINS/Email-ext+plugin
http://www.ncover.com/


Build Stage in CI Jenkins and Psake

The build stage is the first job in the delivery pipeline.
Please refer to the general article about setting up the delivery pipeline with Jenkins and Psake here.
The primary job of the build stage is to update the working copy to the latest revision, build the artifacts and archive them so they are available to the downstream jobs.
Building the artifacts is pretty much building the Visual Studio solutions in your project. Yet this may not be an intuitive job to do.
If projects in the solutions to be built have post-build or pre-build actions that refer to 3rd party software, the build job can become very complex.
If you integrate an existing solution into a CI pipeline, review all related projects’ post- and pre-build actions and check their dependencies on 3rd party software. Make sure the referenced 3rd party software supports command-line invocation.
If you build your CI delivery pipeline for a new project from the beginning, choose wisely the 3rd party software that is going to be integrated with the project. It must support command-line invocation.
The first step for setting up the build stage is to define the repository type and to set up its details.
In my case, as you can see from the screenshot below, I use Subversion. Once you provide the “Repository URL”, an authentication link is displayed so you can enter the repository credentials. After they are saved, they can be changed at any time by clicking the question mark button next to the repo URL field.
defineSVNcredentials
The “buildstage.ps1” can now be elaborated so it starts doing “real” work.
Let’s assume the source content is:
  • An MVC.NET project (regardless of the version)
Let’s name it WebProject and consider it part of WebProject.sln. It is located in the “C:\CI\Source\Project\WebProject\” folder.
  • An installer vdproj that outputs a deployment package (*.zip)
Let’s name it ProjectPackager.vdproj.
The deployment web package will be output to “C:\CI\Source\Setup\Output\”.
  • An SSDT DB project (solution)
Let’s name the solution ProjectDB.sln. It is located in “C:\CI\Source\ProjectDB\”.
  • A unit tests solution
Let’s name the solution WebProject.Tests.sln.
The IDE is Visual Studio 2010.
This scenario covers pretty much everything needed for a web-based software solution that can be rolled out to a production environment via an installer.
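The tasks shown below rely on a few variables such as $code_directory, $config and $revision. A minimal sketch of how they might be declared at the top of buildstage.ps1; the values are assumptions, and the revision is expected to be handed in by Jenkins:

properties {
    $code_directory = "C:\CI\Source"      # root of the checked-out sources
    $config         = "Release"           # solution configuration to build
    $revision       = $env:SVN_REVISION   # revision number provided by the Jenkins Subversion plugin (assumed)
}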
So, the work that the build stage does can be split into 3 major actions:
1. Clean up the solutions and empty the folders that keep previous versions of the deployment packages
Here is the body of the “Clean” Psake task:

task Clean { 
    Write-Host -ForegroundColor "Green" -BackgroundColor "White" "Start Clean task..."

    If ($revision)
    {
        Write-Host $revision
    }
    else
    {
        Write-Host "Revision is EMPTY"
    }

    Write-Host "Clean WebProject.sln..."
    exec { msbuild "$code_directory\WebProject.sln" /t:Clean /v:n /nologo }

    Write-Host "Clean ProjectDB.sln..."
    exec { msbuild "$code_directory\ProjectDB\ProjectDB.sln" /t:Clean /v:n /nologo }

    Write-Host "Clean WebProject bin and obj folders..."
    Remove-IfExists $code_directory\Project\WebProject\bin
    Remove-IfExists $code_directory\Project\WebProject\obj

    #delete packages folder
    Write-Host "Clean Output folder along with packages in it (if exists)..."
    Remove-IfExists $code_directory\Setup\Output

    Write-Host -ForegroundColor "Green" -BackgroundColor "White" "Complete Clean task..."
}

#helper methods
function Create-IfNotExists([string]$name) {
    if (!(Test-Path -path $name))
    {
        New-Item -Path $name -ItemType "directory"
    }
}

function Remove-ThenAddFolder([string]$name) {
    Remove-IfExists $name
    New-Item -Path $name -ItemType "directory"
}

function Remove-IfExists([string]$name) {
    if ((Test-Path -path $name)) {
        # remove empty subfolders first, then the folder itself with all remaining content
        dir $name -recurse | where {!@(dir -force $_.fullname)} | rm
        Remove-Item $name -Recurse
    }
}

As you can see from the “Clean” task code, the solutions can be cleaned from the command line using the msbuild command.

If msbuild is not recognized, check your PATH environment variable and make sure it contains the path to the command.
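Alternatively, the psake script can prepend the needed locations to PATH for the current session only; a small sketch assuming the default .NET 4.0 and Visual Studio 2010 install paths:

# Make msbuild and devenv.com resolvable for this process without touching the machine-wide PATH.
$env:Path = "C:\Windows\Microsoft.NET\Framework64\v4.0.30319;" + $env:Path
$env:Path = "C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE;" + $env:Path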

Below are some errors you may face while trying to get the psake scripts working:

  • error MSB4019:
The imported project "C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v10.0\WebApplications\Microsoft.WebApplication.targets" was not found. Confirm that the path in the <Import> declaration is correct, and that the file exists on disk.

If you spend some time googling for a solution, you will find out there are 2 options:
a) Copying the following folder from your development machine to your build server fixes this, if it’s just web applications:
C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v10.0\WebApplications
b) Download the "Microsoft Visual Studio 2010 Shell (Integrated) Redistributable Package"

This, in my opinion, is the preferable resolution.

Download and install chocolatey (http://chocolatey.org/) and install the VS2010 redist from http://chocolatey.org/packages/VS2010.ShellIntegratedRedist

After Chocolatey is installed, install the VS redistributable package with the command:

cinst VS2010.ShellIntegratedRedist

After installing the redistributable package:

after-installing VS2010 Redist

2. Build the solutions you need

Here is the body of the “Build” Psake task:
task Build { 
    Write-Host -ForegroundColor "Green" -BackgroundColor "White" "Start Build task..."
 
    Write-Host "Build WebProject.sln in " $config "configuration..."
    exec { msbuild "$code_directory\WebProject.sln" /t:Build /p:Configuration=$config /v:m /nologo}

 
    Write-Host "Build ProjectDB.sln in Debug configuration..."
    exec {msbuild "$code_directory\ProjectDB\ProjectDB.sln" /t:Build /p:Configuration=Debug /v:m /nologo}

    Write-Host "Build WebProject.Tests.sln"
    exec {msbuild "$code_directory\WebProject.UnitTests\WebProject.Tests.sln" /t:Build /p:Configuration=Debug /v:m /nologo}

    Write-Host "Build WebProject web application package..."
    exec {devenv.com "$code_directory\WebProject.sln" /Build "Release|Any CPU" /Project "$code_directory\Setup\ProjectPackager.vdproj" /$config }

    Write-Host -ForegroundColor "Green" -BackgroundColor "White" "Complete Build task..."
}

Msbuild is used for building the solutions and devenv.com is used to build the vdproj project types.

Make sure that the path to devenv.com is part of the global environment PATH variable.

Unfortunately there is no way to build a vdproj without installing Visual Studio 2010 on the build machine. Regardless of how bad a practice this is, there is no way around the restriction.

An alternative is to use WiX (http://wix.codeplex.com/) or InstallShield, which are the installer options for VS2012 and later.

Make sure all 3rd party components that are referenced from the GAC are installed on the build machine. These might be various reporting tools, loggers, etc.

Below is a list of common build errors that you may face while working on the “BUILD STAGE” job in Jenkins:
  • Windows SDK required
C:\Windows\Microsoft.NET\Framework64\v4.0.30319\Microsoft.Common.targets(2342,9): error MSB3086: Task could not find "AL.exe" using the SdkToolsPath "" or the registry key

"HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SDKs\Windows\v7.0A". Make sure the SdkToolsPath is set and the tool exists in the correct processor specific location under the SdkToolsPath and that the Microsoft Windows SDK is installed

Resolution:
http://www.microsoft.com/en-us/download/confirmation.aspx?id=8279
http://stackoverflow.com/questions/2731365/running-msbuild-fails-to-read-sdktoolspath

  • Solution file error MSB5009
Solution file error MSB5009: Error parsing the nested project section in solution file.

Resolution:
Regenerate your solution: create an empty solution folder, delete it, and rebuild.

Bear in mind that if you have to install an SVN client on the build machine for some reason (it might be required, for example, by the use of SubWCRev in a post-build event), the version of the Subversion client depends on the version of SVNKit that the Jenkins Subversion plugin supports.

The “Subversion Workspace Version” in “Manage Jenkins” -> “System Configuration” must correspond to the SVNKit version in the installed Subversion plugin (1.7 in this case).

subversionplugin-svnclient

3. Define what the artifacts are and archive them

This is definitely the most important part of the build stage in your delivery pipeline. It is crucial that all actions in a delivery pipeline instance are executed against the very same artifacts; otherwise the output of the pipeline is compromised.

Once the artifacts are built (only once per pipeline instance), they have to be shared with the Jenkins jobs. There are various artifact repositories such as Nexus, Artifactory, etc., but they all turned out to be hard to integrate with a .NET project (at least for me).

That’s why I chose the built-in capability for archiving artifacts that Jenkins has. It stores the artifacts on the file system. All paths on the screenshot are relative to the job’s workspace path (“C:\CI”).

ArchiveArtifacts


The archived artifacts are available for preview for each of the executed/completed job instances.

BuildStageInstance


At this stage, a build can be triggered by a commit to source control. The project’s solutions are cleaned and rebuilt, and the artifacts are identified and archived. They are accessible from each instance in the “BUILD STAGE” job history. An email is sent on success or failure of the build job.

Related links:
https://wiki.jenkins-ci.org/display/JENKINS/Email-ext+plugin (Email notifications plugin)
http://msdn.microsoft.com/en-us/data/tools.aspx (SSDT for Visual Studio)
http://wix.codeplex.com/
