The continuous integration and deployment (CI/CD) pipeline described here is for an open source Squid proxy server project hosted on GitLab and deployed with Docker.
Here is a list of all the jobs, in their order of declaration in the
stages section of the .gitlab-ci.yml file:
- quality: Checks the Dockerfile quality using the hadolint linter.
- Docker-hub-build: Builds Docker images for both ARM and AMD processors.
- Docker-hub-test: Runs tests against both the built ARM and AMD images to ensure they are functional.
- Docker-hub-pushtag: Pushes the newly created Docker images (both ARM and AMD) to Docker Hub.
- test: Runs SAST (Static Application Security Testing) against the source code.
- chatgpt: Carries out a ChatGPT analysis to automate the generation of the README in Markdown format.
Each job in the CI/CD pipeline serves a specific purpose, which we'll break down step-by-step.
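Taken together, the ordering above corresponds to a stages declaration along these lines (a sketch; the stage names are inferred from the job definitions shown below, and the project's exact declaration may differ):

```yaml
stages:
  - quality
  - Docker-hub-build
  - Docker-hub-test
  - Docker-hub-pushtag
  - test
  - chatgpt
```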
quality
This job utilizes the hadolint linter to audit and analyze the Dockerfile for best practices and vulnerabilities.
hadolint:
  image: hadolint/hadolint:latest-debian
  stage: quality
  before_script:
    - cd $CI_PROJECT_DIR
  script:
    - hadolint --ignore DL3008 Dockerfile

Docker-hub-build
In this job, two Docker images are built: one for ARM (docker-hub-build-arm) and one for AMD (docker-hub-build) processors. The Docker images are built from a Dockerfile that includes the Squid proxy server.
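The build script derives the latest Squid version by scraping the upstream Versions page. A minimal offline sketch of that extraction, applied to a hypothetical sample line of the page's HTML (the sample link and version number are assumptions for illustration):

```shell
# Hypothetical fragment of http://www.squid-cache.org/Versions/v6/ (assumed for illustration)
sample='<a href="squid-6.9.tar.gz">squid-6.9.tar.gz</a>'

# Same extraction as the job's script: grab the first squid-*.tar.gz match
# (the greedy regex also swallows the closing quote and link text, which is
# why the pipeline then trims at the first double quote with cut)
tarball=$(echo "$sample" | grep -E -m 1 -o 'squid-.*\.tar\.gz' | cut -d '"' -f1)
echo "$tarball"   # squid-6.9.tar.gz

# The chatgpt job additionally strips the prefix and suffix to get a bare version number
version=$(echo "$tarball" | sed 's/\.tar\.gz//g' | sed 's/squid-//g')
echo "$version"   # 6.9
```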
docker-hub-build:
  image: docker:dind
  artifacts:
  before_script:
    - docker login -u "$DOCKER_HUB_USER" -p "$DOCKER_HUB_TOKEN" $DOCKER_HUB_REGISTRY
  script:
    - export SQUID_VERSION=$(curl -s http://www.squid-cache.org/Versions/v6/ | egrep -m 1 -oh squid-.*.tar.gz | cut -d '"' -f1)
    - docker build --build-arg SQUID_VERSION=$SQUID_VERSION --pull -t $CONTAINER_BUILD_NOPROD_NAME_AMD64 .
    - docker push $CONTAINER_BUILD_NOPROD_NAME_AMD64

Docker-hub-test
This stage runs tests against the Docker images to ensure they are built correctly and functional. It does so by running basic curl commands to verify network connectivity from the containers.
docker-hub-test:
  stage: Docker-hub-test
  extends: .services-amd64
  script:
    - export https_proxy=http://$CONTAINER_TEST_NAME:3128 && curl -k https://www.google.fr

Docker-hub-pushtag
This job pushes the successfully built Docker images to Docker Hub. Each image is tagged with both the Squid version and the architecture type (i.e., amd64 or ARM).
push-docker-hub:
  stage: Docker-hub-pushtag
  image: docker:dind
  before_script:
    - docker login -u "$DOCKER_HUB_USER" -p "$DOCKER_HUB_TOKEN" $DOCKER_HUB_REGISTRY
  script:
    - docker pull $CONTAINER_BUILD_NOPROD_NAME_AMD64
    - docker tag $CONTAINER_BUILD_NOPROD_NAME_AMD64 $HUB_REGISTRY_IMAGE:$SQUID_VERSION-amd64
    - docker push $HUB_REGISTRY_IMAGE:$SQUID_VERSION-amd64
    - docker tag $CONTAINER_BUILD_NOPROD_NAME_AMD64 $HUB_REGISTRY_IMAGE:latest-amd64
    - docker push $HUB_REGISTRY_IMAGE:latest-amd64

test
The test job runs static application security testing (SAST) on the codebase to detect possible vulnerabilities.
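The excerpt does not show the test job's definition; GitLab's built-in SAST is commonly enabled by including the official template, along these lines (a sketch, not the project's confirmed configuration):

```yaml
include:
  - template: Security/SAST.gitlab-ci.yml
```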
chatgpt
The chatgpt job automates the generation of README-like explanations using GPT-3 natural language generation.
chatgpt_analysis:
  stage: chatgpt
  image:
    name: $CONTAINER_CLIENT_IMAGE
  artifacts:
    expire_in: 1 month
    paths:
      - $CI_PROJECT_DIR/chatgpt_analysis*
  script:
    - export SQUID_VERSION=$(curl -s http://www.squid-cache.org/Versions/v6/ | egrep -m 1 -oh squid-.*.tar.gz | cut -d '"' -f1 | sed 's/\.tar\.gz//g' | sed 's/squid-//g')
    - JOBS_CONTENT=$(cat .gitlab-ci.yml gitlabci/*)
    - LAST_COMMIT=$(git log -1 --pretty=format:"%h %s%n%b")
    - JSON_CONTENT=$(jq -n --arg model "gpt-3" --arg content "$CONTENT" '{model:$model, messages:[{role:"user", content:$content}] }')
    - RESPONSE=$(curl -X POST https://api.openai.com/v1/chat/completions -H "Authorization:Bearer $CHATGPT_API_KEY" -H "Content-Type:application/json" -d "$JSON_CONTENT")

GitLab pipelines use several environment variables, which can include file paths or confidential information such as tokens and passwords. It's essential to approach them with caution, as they could contain sensitive data. Here are some examples:
- $CI_PROJECT_DIR: The full path where the repository is cloned and where the pipeline runs.
- $DOCKER_HUB_USER, $DOCKER_HUB_TOKEN, $DOCKER_HUB_REGISTRY: The Docker Hub credentials required to log in and push the new Docker images to Docker Hub.
- $SQUID_VERSION: The version of Squid used by the jobs, obtained through a curl command against http://www.squid-cache.org/Versions/v6/
- $HUB_REGISTRY_IMAGE, $CONTAINER_BUILD_NOPROD_NAME_AMD64, $CONTAINER_BUILD_NOPROD_NAME_ARM: Docker image names for different versions and processors.
Jobs depend on each other through the needs keyword. For instance, in the docker-hub-test job, the directive needs: ["docker-hub-build"] states that it depends on the docker-hub-build job, meaning it will only run after the docker-hub-build job has completed successfully.
docker-hub-test:
  stage: Docker-hub-test
  needs: ["docker-hub-build"]

The needs keyword also allows jobs to run in parallel whenever possible, helping to reduce pipeline completion time.
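For example, because each architecture's test only needs its own build, the two test jobs can run in parallel as soon as their respective builds finish (a sketch; the ARM job names follow the naming pattern described above):

```yaml
docker-hub-test:
  stage: Docker-hub-test
  needs: ["docker-hub-build"]

docker-hub-test-arm:
  stage: Docker-hub-test
  needs: ["docker-hub-build-arm"]
```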
Each job in the pipeline produces results that can be used by later jobs or stored as artifacts for later retrieval:
- quality produces lint results, visible in the pipeline logs.
- Docker-hub-build and Docker-hub-build-arm produce Docker images.
- Docker-hub-test and Docker-hub-test-arm produce test results, available in the pipeline logs.
- Docker-hub-pushtag and Docker-hub-pushtag-arm push the images to Docker Hub, where their results can be checked.
- chatgpt produces a Markdown file with explanations, stored as a GitLab artifact.
The latest commit moved the Markdown file to a website. This implies that the results previously stored in a Markdown file will now be hosted on a website, which enhances the visibility of the generated content and makes it more accessible to users or contributors who do not have access to the GitLab repository or its artifacts.