Set up a CI/CD Pipeline to deploy to Kubernetes with Azure DevOps

Azure DevOps provides mature services for Continuous Integration and Delivery, and setting up a pipeline for Kubernetes is worth covering in detail, as it involves additional steps beyond a normal web app deployment. This article walks you through the sample application, building a pipeline that publishes new versions of a microservice application using Docker and Azure Pipelines as Code, and the final configuration in Azure DevOps.

Application Architecture

The source code is found in the azure-dev-ops branch of the Kubernetes mastery repository. The sample application contains the following services:

  • SA-Feedback: An ASP.NET Core application with an SQLite3 database.
  • SA-Frontend: A ReactJS client served by an Nginx web server.
  • SA-Logic: A Python Flask application that analyzes the sentiment of sentences.
  • SA-WebApp: A Java web application based on Spring.

Maintaining such a diverse set of services is an effort for every DevOps team, because of interdependency conflicts, keeping tools updated, updating the build environment when the application changes, and so on. Using Docker as the build tool itself removes these difficulties, as every service gets its own self-defined environment. To get a better idea of how Docker solves this issue, let's check the Dockerfile of SA-Frontend below, where we define the image node:alpine as the build environment. If other Node applications need a different environment, they define it in their own Dockerfile:

# Stage 1: build the React application
FROM node:alpine as builder
COPY package.json /sa-frontend/
WORKDIR /sa-frontend
RUN npm install
COPY . .
RUN npm run build

# Stage 2: serve the generated static files with Nginx
FROM nginx:1.15-alpine
COPY --from=builder /sa-frontend/build /usr/share/nginx/html
EXPOSE 80

This is a multi-stage Docker build in which:

  • Stage 1: Starts from a Node image, which is needed to build the React application and generate the static files, and
  • Stage 2: Starts from an Nginx image, which suffices to serve the static files copied over from the previous stage.

Note: The other services are built in the same manner.

Azure Build Pipelines – The Continuous Integration service

Continuous Integration is the process that, for each code change, takes the source code, executes tests, checks code quality, and produces an executable version of the application (jar, dll, Docker image, etc.). As the build pipeline is tightly coupled to the source code, changes in the source code often require changes in the build pipeline. That is why many CI services let you define the pipeline as a special file within the source code, versioned together with it.

In Azure Pipelines this special file is named azure-pipelines.yml and must be in the root of the repository, as seen in the repo. The file is shown below:

trigger:
- azure-dev-ops      # 1

variables:           # 2
  sa-frontend-image-name: '$(dockerId)/sentiment-analysis-frontend:$(build.buildId)'
  sa-web-app-image-name: '$(dockerId)/sentiment-analysis-web-app:$(build.buildId)'
  sa-logic-image-name: '$(dockerId)/sentiment-analysis-logic:$(build.buildId)'
  sa-feedback-image-name: '$(dockerId)/sentiment-analysis-feedback:$(build.buildId)'

jobs:                # 3
- job: SA_Frontend       # 3.A
  steps:             # 4
  - script: docker login -u $(dockerId) -p $(password)
    displayName: 'docker login'
  - script: docker build -f ./sa-frontend/Dockerfile -t $(sa-frontend-image-name) ./sa-frontend
    displayName: 'docker build frontend'
  - script: docker push $(sa-frontend-image-name) 
    displayName: 'docker push frontend'

- job: SA_WebApp         # 3.B
  steps:
  - script: docker login -u $(dockerId) -p $(password)
    displayName: 'docker login'
  - script: docker build -f ./sa-webapp/Dockerfile -t $(sa-web-app-image-name) ./sa-webapp
    displayName: 'docker build webapp'
  - script: docker push $(sa-web-app-image-name) 
    displayName: 'docker push webapp'
# Shortened for brevity

Explanations of the points above:

  1. Trigger: defines the branch that triggers the build in Azure Pipelines.
  2. Variables: values needed later in the configuration.
  3. Jobs: an array of jobs that are executed in parallel; 3.A and 3.B are instances of this array.
  4. Steps: executed by the job. In this case we log in with the Docker user, build the image specified in the Dockerfile, and push the image to Docker Hub.
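To see how the image-name variables from point 2 expand, here is a small shell sketch. The values of dockerId and the build id are assumed examples, not real pipeline state:

```shell
# Sketch: how a variable like sa-frontend-image-name expands at build time.
# dockerId and buildId are assumed example values for illustration.
dockerId="rinormaloku"
buildId="42"

sa_frontend_image_name="$dockerId/sentiment-analysis-frontend:$buildId"
echo "$sa_frontend_image_name"
# rinormaloku/sentiment-analysis-frontend:42
```

Azure Pipelines performs this expansion itself via the `$(...)` macro syntax; the sketch only mirrors the result.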

In the script above we tag the Docker images with the Build Id (a common practice, as it makes it possible to trace an image back to the commit that produced it). To make use of those images, you have to update the Kubernetes manifests found in the folder resource-manifests with the current Build Id.

This is done by replacing the placeholders:

  • From:  image: {{DOCKER_USER}}/sentiment-analysis-frontend:{{TAG}}
  • To: image: rinormaloku/sentiment-analysis-frontend:1

The updated manifests are then published as artifacts. This is achieved by the job below (also included in azure-pipelines.yml):

- job: Publish_Artifacts
  steps:
  - bash: >
      find resource-manifests -type f -name "*.yaml" -print0 |
      xargs -0 sed -i
      -e "s/{{TAG}}/$(build.buildId)/"
      -e "s/{{DOCKER_USER}}/$(dockerId)/"
  - task: PublishBuildArtifacts@1
    inputs:
      pathtoPublish: '$(System.DefaultWorkingDirectory)/resource-manifests'
      artifactName: resource-manifests
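You can reproduce the placeholder substitution locally. The sketch below uses a throwaway manifest in /tmp whose content is illustrative (the real manifests contain full Kubernetes resource specs), with stand-in values for the pipeline variables:

```shell
# Create a throwaway manifest with the same placeholders
# (illustrative content, not the actual repository manifest).
mkdir -p /tmp/resource-manifests
printf 'image: {{DOCKER_USER}}/sentiment-analysis-frontend:{{TAG}}\n' \
  > /tmp/resource-manifests/sa-frontend.yaml

# Stand-ins for the pipeline variables $(build.buildId) and $(dockerId)
buildId="1"
dockerId="rinormaloku"

# The same find | xargs sed replacement the Publish_Artifacts job runs
find /tmp/resource-manifests -type f -name "*.yaml" -print0 |
  xargs -0 sed -i -e "s/{{TAG}}/$buildId/" -e "s/{{DOCKER_USER}}/$dockerId/"

cat /tmp/resource-manifests/sa-frontend.yaml
# image: rinormaloku/sentiment-analysis-frontend:1
```

The -print0 / xargs -0 pairing keeps the replacement safe even if a manifest path contains spaces.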

With the pipeline defined, we now need to hook it up in Azure DevOps.

Setting up Azure DevOps to use the defined Pipeline

To continue with this article, you need Azure DevOps (register for free), and you need to fork the sample app repository.
Continue by creating a new project in Azure DevOps and under Pipelines, Builds click New and from the dropdown select New Build Pipeline.
You will be forwarded to a guided tour:

  • Where is your code? Select GitHub.
  • Select a repository: Connect your Azure DevOps with your user and select the repository $GITHUB_USER/k8s-mastery.
  • Choose a template: Select the recommended template; we will change it later.
  • Save and run: Click Save and run, select Create new branch, and click Save and run again.

Cancel the build, as we need to edit the pipeline: click the three dots and select Edit pipeline.

Azure Pipelines

Click Edit in the visual designer and, under "Get Sources", switch to the branch azure-dev-ops, which contains our Azure Pipelines definition.

Switching Branch

Navigate to Variables and add your Docker user id and your password (needed to publish to Docker Hub), as seen in the image below.

 Variables

Click Save & queue and wait for the jobs to complete. 

Build Artifacts

The artifacts of this build are the Docker Images (check out your docker hub repository) and the updated Kubernetes Resource Manifests, as can be seen in the Artifacts explorer in Azure DevOps:

Artifacts

These artifacts are environment-independent; in the next section we will make use of them in our Azure Release Pipeline.

Azure Release Pipelines – The Continuous Delivery service

Continuous Delivery is the process of taking the Artifacts produced from Continuous Integration and deploying them automatically to a live environment (e.g. Dev, Staging, Production).

To create a release pipeline, navigate to Pipelines, then under Releases click New and from the drop-down select New release pipeline. From the templates select Deploy to a Kubernetes cluster and click Apply, then name the stage, let's say, Development.

On the Dashboard, click Add an artifact, select the Source type Build and select your project and the build pipeline we created earlier.

To update the Development stage, click the link below the stage name, as seen in the image below:

Development

By default you get a kubectl apply task, which you need to update:

  1. Connect to your Kubernetes cluster in the field Kubernetes service connection.
  2. Tick the checkbox Use Configuration Files.
  3. In the Configuration File field, select the resource-manifests directory.
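When pointed at a directory, the task effectively applies every manifest inside it. The sketch below only prints the kubectl commands instead of executing them, since actually running them requires a configured cluster connection; the directory and file names are illustrative:

```shell
# Sketch: what the release task does when given a directory of manifests.
# We echo the kubectl commands rather than running them, as executing
# them would need a real cluster connection.
mkdir -p /tmp/release-manifests
touch /tmp/release-manifests/sa-frontend.yaml /tmp/release-manifests/sa-web-app.yaml

for manifest in /tmp/release-manifests/*.yaml; do
  echo "kubectl apply -f $manifest"
done
```

In practice you can also pass the whole directory at once with `kubectl apply -f resource-manifests/`, which is what the task does for you.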

In the end, it should look like the image below:

Task

Save the changes and trigger the build!

For the application to work, you need to install an Ingress Controller as shown in our blog "Ingress Controller - simplified routing in Kubernetes".

Copyright by Orange Networks GmbH