Continuous Delivery
Continuous releasing of Maven artifacts 
Marcus Martina 30 Sep, 2012
rm -rv ${JENKINS_HOME}/maven-repositories/*

For the slaves, the variable JENKINS_HOME in this expression needs to be replaced by the applicable Jenkins home directory. The NodeLabel Parameter Plugin can be used to assign the cleanup jobs to the specific nodes, and the Heavy Job Plugin can be used to allocate all available executors on that node in order to ensure exclusive access to all the local repositories.

Now let's create the actual pipeline by using the Parameterized Trigger Plugin, which enables us to pass predefined parameters to downstream jobs. In this way we can propagate a single pipeline revision number from the first job throughout the pipeline. Let's say Subversion is used as our version control system. In that case we can predefine the pipeline revision number parameter PL_SVN_REVISION as the built-in variable SVN_REVISION:
PL_SVN_REVISION = ${SVN_REVISION}

From all downstream jobs the pipeline parameter can simply be propagated by enabling the current build parameters feature. The PL_SVN_REVISION parameter can be used in downstream jobs to check out revision-specific code by adding @${PL_SVN_REVISION} to the Repository URL. This could be code that is needed to bootstrap an integration test or smoke test, or to perform a live deployment. This code is preferably just a single POM file or a small module. It does not even need to contain actual test scripts or fixtures: these can simply be packed as an artifact in the first job and, just like other artifacts, be reused in downstream jobs.

The Jenkins Maven Repository Server Plugin is a great tool here, as it exposes a single job build as a Maven repository containing all the artifacts that are archived as part of that build. This makes it very easy and efficient to reuse these specific artifacts in downstream jobs via the usual Maven dependency mechanism. There is no longer a need to let Maven deploy artifacts to a separate Maven repository manager like Nexus, as Jenkins has become fully self-sufficient. Furthermore, only Jenkins itself is capable of providing the specific artifacts that belong to a specific pipeline.

The Jenkins Maven Repository Server Plugin lets us define an upstream job build as a Maven repository. Although only the last successful build is supported out of the box, with a little trick it is still possible to select a different upstream job build, more specifically the build that created the artifacts that need to be reused within the pipeline. For that purpose let us predefine another pipeline parameter PL_CREATE_BUILD as a combination of the built-in variables JOB_NAME and BUILD_NUMBER:
PL_CREATE_BUILD = ${JOB_NAME}/Build/${BUILD_NUMBER}

This pipeline parameter can be propagated from the first job throughout the pipeline together with the PL_SVN_REVISION parameter. Now the path ${PL_CREATE_BUILD}/repository actually denotes the correct upstream Maven repository. This expression can be entered as the specified path in repository when defining an upstream Maven repository. But because it is an expression, the Jenkins parse POMs phase will fail. The solution is to use the Environment Injector Plugin, which makes it possible to inject any environment variable into the build process. The Jenkins Maven Repository Server Plugin defines an environment variable Jenkins.Repository under the hood when defining an upstream Maven repository. Let's instead inject this environment variable explicitly as property content with the Environment Injector Plugin:
Jenkins.Repository = ${JENKINS_URL}plugin/repository/project/${PL_CREATE_BUILD}/repository

As documented by the Jenkins Maven Repository Server Plugin, this environment variable can be used to specify a Maven repository ${env.Jenkins.Repository} in a jenkins profile in the Maven settings.xml file. To ensure that the executor-specific local Maven repository is updated with the snapshot versions of the artifacts created in the first job of the pipeline, it is necessary to set the snapshots updatePolicy to always. After all, the different steps of a pipeline are not tied to a specific node or executor, which is the main reason to use the Maven Repository Server Plugin in the first place.

Please note that a build executed by an executor can actually be based on an older pipeline than the preceding build executed on the same executor. In that case the snapshot artifacts in the local Maven repository actually need to be replaced by older versions, meaning artifacts with older timestamps. Fortunately this appears to be the default behavior of Maven, so there is no pre-build step required to clean up the local repository in order to guarantee a true pipeline.

Sometimes build steps from different pipelines cannot be executed concurrently because they require exclusive access to the same resources. If only a single job is concerned, it is sufficient to make sure that the Jenkins built-in feature execute concurrent builds if necessary is disabled for this job, which is the default behavior anyway. In cases where different jobs are involved, they need to be throttled explicitly. For instance, deploying artifacts to a live server and running a smoke test for artifacts already deployed on the same server can obviously not be executed concurrently. The Throttle Concurrent Builds Plugin can be used to define throttle categories and restrict concurrent execution of jobs by assigning them to the same throttle category.
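The jenkins profile in settings.xml described above might be sketched as follows; the repository id jenkins-upstream is an illustrative choice, not something prescribed by the plugin:

```xml
<!-- Fragment of the <profiles> section of the Maven settings.xml.
     The Jenkins.Repository environment variable injected by the
     Environment Injector Plugin is resolved by Maven via
     ${env.Jenkins.Repository}. -->
<profiles>
  <profile>
    <id>jenkins</id>
    <repositories>
      <repository>
        <!-- "jenkins-upstream" is an illustrative id. -->
        <id>jenkins-upstream</id>
        <url>${env.Jenkins.Repository}</url>
        <snapshots>
          <enabled>true</enabled>
          <!-- Always re-check snapshots so the executor-specific local
               repository picks up the artifacts of the current pipeline. -->
          <updatePolicy>always</updatePolicy>
        </snapshots>
      </repository>
    </repositories>
  </profile>
</profiles>
```

The profile can then be activated for the Jenkins Maven builds, for example by adding -Pjenkins to the Maven goals of the downstream jobs.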
In addition, build steps of different pipelines sometimes cannot be executed in an arbitrary order. For instance, deploying the artifacts of one pipeline to a live server must not be followed by deploying the artifacts of another pipeline before a smoke test has run for the already deployed artifacts. This can be guaranteed by assigning a higher priority to the smoke test job using the Priority Sorter Plugin.

Sometimes a step in the pipeline fails because of a technical error that is not related to the associated revision. In order to trigger a rebuild of the failed downstream job, the pipeline parameters PL_SVN_REVISION and PL_CREATE_BUILD need to be specified manually, which is a bit awkward. Here the Rebuilder Plugin comes in handy, as it facilitates rebuilding a job with the same parameters as the failed build. Alternatively, the Build Pipeline Plugin, which provides a nice visualization of the most recent pipelines, can also be used to manually retrigger a failed build. Unfortunately, the current version does not visualize the revision numbers of the pipelines. In order to visualize the actual revision numbers within Jenkins, the Build Name Setter Plugin can be used instead. This makes it easier to identify builds by revision number instead of by build number. For the first job one can set the build name as:
#${BUILD_NUMBER} - rev ${ENV,var="SVN_REVISION"}

For downstream jobs the variable SVN_REVISION in this expression needs to be replaced by the pipeline parameter PL_SVN_REVISION, as these can have different values.
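With that substitution, and assuming the same Build Name Setter token macro syntax as above, the downstream build name format would read:

```
#${BUILD_NUMBER} - rev ${ENV,var="PL_SVN_REVISION"}
```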