Create a Multi-Job Pipeline
This tutorial demonstrates how to create a sequence of dependent Jobs and pass files produced by one Job into another.
Create a Pipeline
Fork the sample Gradle/Docker Pipeline repository to your GitHub account. This repository contains a Gradle project that builds a Spring Boot application, and a Dockerfile for building a Docker image.
Click New Pipeline... and wait for TeamCity to scan your VCS via the OAuth connection you provided and display the list of available repositories. Click the forked repository to create a new Pipelines project.
Open the Job settings, expand the automatically created Step 1 and change its type from Script to Gradle.
Enter

clean build -x test

in the Tasks field.

Gradle recommends running builds with the Gradle Wrapper, a script that lets builds use a specific version of Gradle and, if that version is not present, download it before the target build starts. Ensure the Use Gradle Wrapper option in the Step settings section is enabled to tell TeamCity it should look for the Wrapper inside the repository.
Enter

build/libs/

in the Artifacts field to tell TeamCity it should publish this folder with the resulting .jar file when the Gradle Step finishes.

Save and run your Pipeline and ensure it finishes successfully. Note that the published files are now available on the Artifacts tab of the build results page.
Add Simultaneously Running Tests
Our sample project has two modules with tests, each with its own set of Gradle instructions inside its build.gradle file. With Pipelines, you can add separate Jobs that run these test sets simultaneously.
On the main page of your Pipeline settings, click Add to create a new Job.
In the Dependencies section of the new Job's settings, tick the checkbox next to the first Job's name. This specifies that the new Job should start only after the first Job finishes. Running tests does not require any files produced during the building stage of our Pipeline, so you can switch the selector to Ignore artifacts.
In Step settings, choose Gradle as the Step type and enter

clean test

in the Tasks field.

All new Steps have "repository root" as the default Working directory setting. A working directory is a path in a build agent's local storage that stores files fetched from the remote repository. Whenever you run a Step, it uses files from this local repository copy. If you need to point a Step to a specific directory instead of the entire repository root folder, specify the required directory path (relative to the root directory). In our case, enter

test1

in the Working directory field.

To manually select a Gradle build file, enter the path to this file in the corresponding field. Note that the path must be relative to the current working directory. For instance, since our working directory is "test1", you can enter

build.gradle

and TeamCity will use the correct test1/build.gradle file rather than the similar file in the project root folder that Job #1 uses.

As with the first Job, ensure the Use Gradle Wrapper setting is on so that TeamCity looks for Wrapper files inside the "test1" directory. The Gradle wrapper path should remain empty since this file resides at the default location.
Repeat steps 1 to 6 to create another Job, but enter

test2

instead of test1 when setting the Working directory. This directory contains its own build.gradle and Gradle Wrapper files, so all other settings should remain the same.

You now have two Jobs that can start as soon as the first Job finishes building the application. The visual graph splits the sequence of Jobs into separate branches to illustrate this setup. Save and run your Pipeline to ensure both testing Jobs run their Steps simultaneously.
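In YAML terms, the two testing Jobs might look roughly like this. The key and Job names are assumptions for illustration; consult the YAML tab in Pipeline settings for the exact schema.

```yaml
# Sketch only: two test Jobs that depend on the build Job and run in parallel.
jobs:
  Test1:
    dependencies:
      - Job1                     # start after the build Job; artifacts ignored
    steps:
      - type: gradle
        tasks: clean test
        working-directory: test1 # module with its own build.gradle and Wrapper
  Test2:
    dependencies:
      - Job1
    steps:
      - type: gradle
        tasks: clean test
        working-directory: test2
```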
Add a Final Job
Now that your project is built and tested, we can add a final Job that uses a file produced by the first Job to build and publish a Docker image.
On the main Pipeline page, click Add... to create a new Job.
Since our new Job should be the last one to run, tick both testing Job names under the Dependencies section.
Selecting the two testing Jobs in Dependencies is enough to put our new Job at the end of the Pipeline. However, to build a Docker image, it requires the .jar file produced by the very first Job. To give our new Job access to this file, tick the first Job in Dependencies and ensure the Use artifacts mode is on. Adding this last dependency does not change the order in which Jobs are launched, but ensures the file from Job #1 is shared with Job #4.
Leave the Step type as Script and enter the following command as the script body:

docker build --pull --file ./docker/Dockerfile --tag myusername/myrepositoryname:mycustomtag .

This script runs the docker build command with the following parameters:

--pull — attempts to pull a newer version of the base image.
--file — the path to the Dockerfile.
--tag — the name for your built Docker image.
. — sets the agent checkout directory as the build context for the docker build command.
To publish the built image, use the following command:

docker push myusername/myrepositoryname:mycustomtag

You can create another Job that runs it, add a separate Step to the final Job, or simply append this line to the existing Step.

If you run the Pipeline now, it will fail on the docker push command: you can pull public images anonymously, but pushing images requires authentication. You need to provide the credentials of a user with the "Write" repository permission. In Job settings, expand the Integrations section, click Add | Docker Repository, and enter valid user credentials.
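A rough YAML sketch of this final Job follows. Key names and Job IDs are assumptions for illustration, and the image name is the same placeholder used above; check the YAML tab in Pipeline settings for the real schema.

```yaml
# Sketch only: the final Job depends on all three earlier Jobs.
jobs:
  Publish:
    dependencies:
      - Job1    # Use artifacts: provides the .jar built by the first Job
      - Test1   # Ignore artifacts: ordering only
      - Test2   # Ignore artifacts: ordering only
    steps:
      - type: script
        script: |
          docker build --pull --file ./docker/Dockerfile --tag myusername/myrepositoryname:mycustomtag .
          docker push myusername/myrepositoryname:mycustomtag
```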
Save the Pipeline and run it. Check your Docker Hub repository to ensure the last Job has uploaded the Docker image.
Optional: Add Parameters
You now have a Pipeline that builds and tests your sample application, builds a Docker image from it, and publishes this image to Docker Hub. As an additional customization, you may want to replace plain string values in your Script step with references to parameters.
Open Pipeline settings and add two new Parameters:

DImageName — stores the name of your Docker image.
DRepoName — stores your Docker Hub registry name in the your_user_name/repository_name format.
You can now avoid repeatedly entering image and registry names, and instead use the %parameter_name% syntax to reference the parameters that store these values. In our case, you can modify the script of the last Job:

docker build --pull --file ./docker/Dockerfile --tag %DRepoName%:%DImageName%-%build.number% .
docker push %DRepoName%:%DImageName%-%build.number%

Note that the updated script uses a reference to the build.number parameter. This is a predefined TeamCity parameter that stores the number of the current build.
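In YAML, such Pipeline-level parameters might be declared roughly like this. This is an illustrative sketch: the values shown are placeholders, and the exact key names may differ from your instance's schema.

```yaml
# Sketch only: Pipeline parameters referenced with %name% syntax in scripts.
parameters:
  DImageName: mycustomimage              # placeholder image name
  DRepoName: myusername/myrepositoryname # placeholder user/repository pair
```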
YAML Configuration
The final Pipeline should have the following YAML configuration. You can go to Pipeline settings, switch the editing mode from Visual to YAML, and compare your current settings with this reference configuration to check whether some of your settings are missing or have different values.