This GitLab CI/CD pipeline consists of ten stages: Quality, Get-version, Docker-hub-build, Docker-hub-test, Docker-hub-pushtag, Docker-hub-build-arm, Docker-hub-test-arm, Docker-hub-pushtag-arm, Test, and Docs.
In the order of the stages, the jobs are:
- hadolint (Quality): checks your Dockerfile to ensure it adheres to best practices and industry standards.
- getsquid_vars (Get-version): reads the latest Squid version from GitHub, updates the README.md file with the version, and pushes the changes to the master branch.
- docker-hub-build (Docker-hub-build): builds the Docker image and pushes it to Docker Hub.
- docker-hub-test (Docker-hub-test): tests the Docker image by attempting to access a website through the Squid proxy.
- SquidParseConfig (Docker-hub-test): checks Squid's configuration for errors.
- dive (Docker-hub-test): displays a report detailing each layer in the Docker image.
- push-docker-hub (Docker-hub-pushtag): tags the Docker image and pushes it to Docker Hub.
- docker-hub-build-arm, docker-hub-test-arm, and docker-hub-pushtag-arm: mirror the build, test, and push jobs for an ARM-architecture Docker image.
- chatgpt_analysis (Docs): uses OpenAI's ChatGPT to generate a Markdown file explaining all jobs in this pipeline.
- update_dockerhub_readme (Docs): updates the README on Docker Hub with the content of the current project's README.md file.
hadolint - Checks that the Dockerfile adheres to best practices and industry standards, using a single hadolint command in the script block:

```yaml
hadolint:
  image: hadolint/hadolint:latest-debian
  stage: Quality
  before_script:
    - cd $CI_PROJECT_DIR
  script:
    - hadolint --ignore DL3008 Dockerfile
```

getsquid_vars - Fetches the latest Squid version, updates README.md with the version, and pushes the changes:

```yaml
getsquid_vars:
  stage: Get-version
  image:
    name: $CONTAINER_CLIENT_IMAGE
  artifacts:
    paths:
      - variables.env
  script:
    - echo "SQUID_VERSION=$SQUID_VERSION" > variables.env
    -
    - git add README.md
    - git push
```

docker-hub-build - Logs in to Docker Hub, then builds and pushes the amd64 Docker image (the ARM image has its own mirrored job):

```yaml
docker-hub-build:
  stage: Docker-hub-build
  image: docker:dind
  before_script:
    - docker login -u "$DOCKER_HUB_USER" -p "$DOCKER_HUB_TOKEN" $DOCKER_HUB_REGISTRY
  script:
    - docker build --build-arg SQUID_VERSION=$SQUID_VERSION --pull -t $CONTAINER_BUILD_NOPROD_NAME_AMD64 .
    - docker push $CONTAINER_BUILD_NOPROD_NAME_AMD64
```

docker-hub-test - Tests the Docker image by attempting to access a website through the proxy:

```yaml
docker-hub-test:
  stage: Docker-hub-test
  extends: .services-amd64
  script:
    - export https_proxy=http://$CONTAINER_TEST_NAME:3128 && curl -k https://www.google.fr
  variables:
    HOSTNAME: squidpipeline
  needs: ["docker-hub-build"]
```

SquidParseConfig - Checks Squid's configuration for errors:

```yaml
SquidParseConfig:
  stage: Docker-hub-test
  script:
    - /usr/sbin/squid -k parse /etc/squid/squid.conf
```

dive - Displays a report detailing each layer in the Docker image:

```yaml
dive:
  stage: Docker-hub-test
  script:
    - docker pull $CONTAINER_BUILD_NOPROD_NAME_AMD64
    - dive $CONTAINER_BUILD_NOPROD_NAME_AMD64
  variables:
    CI: "true"
```

push-docker-hub - Tags the Docker image with the version and pushes it to Docker Hub:

```yaml
push-docker-hub:
  stage: Docker-hub-pushtag
  image: docker:dind
  script:
    - docker pull $CONTAINER_BUILD_NOPROD_NAME_AMD64
    - docker tag $CONTAINER_BUILD_NOPROD_NAME_AMD64 $HUB_REGISTRY_IMAGE:$SQUID_VERSION-amd64
    - docker push $HUB_REGISTRY_IMAGE:$SQUID_VERSION-amd64
    - docker tag $CONTAINER_BUILD_NOPROD_NAME_AMD64 $HUB_REGISTRY_IMAGE:latest-amd64
    - docker push $HUB_REGISTRY_IMAGE:latest-amd64
    - docker tag $CONTAINER_BUILD_NOPROD_NAME_AMD64 $HUB_REGISTRY_IMAGE:latest
    - docker push $HUB_REGISTRY_IMAGE:latest
  only:
    - master
```

docker-hub-build-arm, docker-hub-test-arm, and docker-hub-pushtag-arm mirror the build, test, and push jobs for an ARM-architecture Docker image.
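The variables.env file that getsquid_vars publishes is a dotenv-style artifact: later jobs can read SQUID_VERSION back out of it. A minimal local sketch of that round trip (the version value is a placeholder, and the explicit sourcing step is an assumption - the pipeline declares the file under artifacts:paths, so a consuming job would load it itself):

```shell
# What getsquid_vars writes (version value is a placeholder):
SQUID_VERSION="6.9"
echo "SQUID_VERSION=$SQUID_VERSION" > variables.env

# A downstream job that received variables.env as an artifact could
# re-import the variable by sourcing the file:
unset SQUID_VERSION
. ./variables.env
echo "$SQUID_VERSION"
```

Declaring the file under `artifacts:reports:dotenv` instead would make GitLab inject the variable into dependent jobs automatically, without the manual sourcing step.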
chatgpt_analysis: This job uses OpenAI's ChatGPT to generate a Markdown file explaining all jobs in this pipeline. To do this, it constructs a detailed request containing a description of each job and sends it to ChatGPT, which returns an explanation of each one; the response is then stored in a dated Markdown file.
```yaml
chatgpt_analysis:
  stage: Docs
  script:
    - CONTENT="description...."
    - JSON_CONTENT=$(jq -n --arg model "gpt-4" --arg content "$CONTENT" '{model:$model, messages:[{role:"user", content:$content}] }')
    - RESPONSE=$(curl -X POST https://api.openai.com/v1/chat/completions -H "Authorization:Bearer $CHATGPT_API_KEY" -H "Content-Type:application/json" -d "$JSON_CONTENT")
    - echo "$ANSWER"
    - echo -e "$RESPONSE" > chatgpt_analysis_$(date +%Y%m%d).md
```

Note that $ANSWER is echoed but never set in this script; it is the raw $RESPONSE that gets written to the Markdown file.

update_dockerhub_readme: This job updates Docker Hub's README file with the README.md file contents by sending a PATCH request to Docker Hub's API:

```yaml
update_dockerhub_readme:
  stage: Docs
  script:
    - README_CONTENT=$(cat README.md)
    - PAYLOAD=$(jq -n --arg desc "$README_CONTENT" '{"full_description":$desc}')
    - curl -X PATCH -H "Authorization:JWT $TOKEN" -H "Content-Type:application/json" -d "$PAYLOAD" https://hub.docker.com/v2/repositories/$HUB_REGISTRY_IMAGE
  only:
    - master
```

Each job has environment variables defined to facilitate various operations. Here are some of them:
- GIT_CLONE_PATH: the directory where the repository should be cloned.
- CONTAINER_CLIENT_IMAGE: the Docker image used for running the majority of tasks in the pipeline.
- CONTAINER_TEST_NAME: the name of the test container.
- CONTAINER_BUILD_NOPROD_NAME_AMD64 and CONTAINER_BUILD_NOPROD_NAME_ARM: the Docker images built for each architecture.
- DOCKER_HUB_USER and DOCKER_HUB_TOKEN: the Docker Hub credentials used by the build and push jobs to log in.
- SQUID_VERSION: the latest Squid version fetched from GitHub.

This pipeline takes advantage of GitLab CI's needs keyword, which expresses dependencies between jobs. This makes jobs run as soon as their dependencies have finished, instead of waiting for all jobs from prior stages to finish.
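The jq/curl pattern used by chatgpt_analysis can be exercised locally. The sketch below builds the request body the same way the job does, but substitutes a canned response for the curl call, and extracts the answer with .choices[0].message.content - that extraction step is an assumption following the OpenAI chat-completions response format, since the job as shown stores the raw response:

```shell
# Build the request body the way chatgpt_analysis does;
# jq --arg takes care of JSON-escaping the prompt text.
CONTENT="Explain each job in this pipeline."
JSON_CONTENT=$(jq -n --arg model "gpt-4" --arg content "$CONTENT" \
  '{model:$model, messages:[{role:"user", content:$content}]}')

# Canned response standing in for the real curl call to
# https://api.openai.com/v1/chat/completions:
RESPONSE='{"choices":[{"message":{"content":"hadolint lints the Dockerfile."}}]}'
ANSWER=$(echo "$RESPONSE" | jq -r '.choices[0].message.content')
echo "$ANSWER"
```

Extracting the content this way, rather than saving the whole response, would leave only the generated Markdown in the output file.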
For instance, the chatgpt_analysis job needs the getsquid_vars, docker-hub-test, and docker-hub-test-arm jobs to complete before starting. Similarly, docker-hub-build and docker-hub-build-arm need getsquid_vars to complete before they can start.
The pipeline generates various artifacts, log reports, and test results that are used in subsequent stages:
- getsquid_vars generates an artifact (variables.env) containing the latest Squid version, used by the following jobs when building the Docker images.
- docker-hub-build and docker-hub-build-arm create Docker images for amd64 and arm respectively.
- docker-hub-test and docker-hub-test-arm test the Docker images built in the previous step.
- push-docker-hub and push-docker-hub-arm push the Docker images to Docker Hub.
- chatgpt_analysis generates a Markdown file that provides a detailed explanation of each job in the pipeline.

The latest commit auto-updates the README file with the latest Squid version from GitHub, keeping the README in step with the Squid version currently in use.
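The tagging scheme used by the push jobs can be sketched as a small loop over the tags they publish (the version and image names here are placeholders; the real job issues one docker tag / docker push pair per tag):

```shell
SQUID_VERSION="6.9"              # placeholder
HUB_REGISTRY_IMAGE="user/squid"  # placeholder
# The amd64 push job publishes three tags: version-pinned,
# architecture-pinned latest, and plain latest.
for tag in "$SQUID_VERSION-amd64" "latest-amd64" "latest"; do
  echo "$HUB_REGISTRY_IMAGE:$tag"
done
```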
This commit affects the pipeline by triggering the
update_dockerhub_readme job, which ensures the README in
Docker Hub is kept in sync with the project’s README file.
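The $TOKEN used by update_dockerhub_readme is a Docker Hub JWT; the pipeline does not show how it is obtained, but a common approach (an assumption, not confirmed by the source) is a login call whose JSON response carries the token. A self-contained sketch using a canned response in place of the network call:

```shell
# Hypothetical: the real call would be something like
#   curl -s -X POST https://hub.docker.com/v2/users/login \
#     -H "Content-Type: application/json" \
#     -d "{\"username\":\"$DOCKER_HUB_USER\",\"password\":\"$DOCKER_HUB_TOKEN\"}"
# Canned response used here so the sketch runs offline:
LOGIN_RESPONSE='{"token":"example-jwt"}'
TOKEN=$(echo "$LOGIN_RESPONSE" | jq -r '.token')
echo "$TOKEN"
```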