The pipeline in the .gitlab-ci.yml file is composed of several jobs, each with a definite role in the workflow. The order of the jobs, as determined by the ‘stages’ section of the file, is as follows:

- Quality stage: contains the hadolint job, which ensures the Dockerfile follows best practices.
- Get-version stage: contains a single job, getsquid_vars, whose purpose is to fetch the latest version of Squid.
- Docker-hub-build stage: contains two jobs, docker-hub-build and docker-hub-build-arm, which build the Docker images for the AMD64 and ARM architectures respectively.
- Docker-hub-test stage: contains two jobs, docker-hub-test and docker-hub-test-arm, which test the built Docker images for the AMD64 and ARM architectures respectively.
- Docker-hub-pushtag stage: the Docker images for the AMD64 and ARM architectures are tagged and pushed to the Docker Hub registry by the push-docker-hub and push-docker-hub-arm jobs.
- Docs stage: includes the chatgpt_analysis and update_dockerhub_readme jobs, which generate documentation and update the Docker Hub repository’s readme.

The hadolint job runs the hadolint linter against the Dockerfile in the CI environment. It makes sure the Dockerfile follows best practices; any lint errors are shown in the pipeline log.
    hadolint:
      image: hadolint/hadolint:latest-debian
      stage: Quality
      before_script:
        - cd $CI_PROJECT_DIR
      script:
        - hadolint --ignore DL3008 Dockerfile

- image specifies the Docker image to use for this job; hadolint/hadolint:latest-debian comes with Hadolint, a Dockerfile linter, preinstalled.
- stage specifies that this job belongs to the Quality stage.
- before_script prepends a directory-change command to the commands defined in the script parameter.
- script executes the hadolint linter against the Dockerfile; --ignore DL3008 tells hadolint to suppress rule DL3008.

By running hadolint as part of the pipeline, the team can discover potential bugs or irregularities in the Dockerfile early in the workflow.
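For context, DL3008 is hadolint’s rule about unpinned package versions in apt-get install. A minimal, hypothetical Dockerfile fragment that would trigger it (this is not the project’s actual Dockerfile):

```dockerfile
FROM debian:bookworm-slim
# DL3008 would warn on the next line because no package version is pinned
# (pinning, e.g. curl=<version>, would silence it)
RUN apt-get update && apt-get install -y curl \
    && rm -rf /var/lib/apt/lists/*
```

Suppressing the rule with --ignore DL3008 is a deliberate trade-off: unpinned installs always pull the current packages, at the cost of less reproducible builds.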
The getsquid_vars job is responsible for fetching the
latest version of Squid and persisting it in a file for later jobs to
consume.
    getsquid_vars:
      stage: Get-version
      image:
        name: $CONTAINER_CLIENT_IMAGE
      artifacts:
        expire_in: 1 hour
        paths:
          - variables.env
      script:
        - apt update && apt install git curl ca-certificates -y --no-upgrade --no-install-recommends --no-install-suggests
        - export SQUID_VERSION=$(curl -LsXGET https://github.com/squid-cache/squid/releases/latest | grep -m 1 "Release" | cut -d " " -f4 |tr -d 'v')
        - echo "SQUID_VERSION=$SQUID_VERSION" > variables.env
        - echo $SQUID_VERSION
        - sed -i "s/{{SQUID_VERSION}}/$SQUID_VERSION/g" README_template.md
        - sed -i "s/{{DATE}}/$(date +%Y%m%d)/g" README_template.md
        - cp README_template.md README.md
        - git config user.email "fredbcode"
        - git config user.name "fredbcode"
        - git add README.md
        - git commit -m "README Auto update [skip ci]" || true
        - git push https://$GITLAB_TOKEN@gitlab.com/fredbcode-images/squid.git HEAD:master || true

- artifacts specifies which files should be kept and stored as job artifacts for later stages.
- script contains a series of shell commands that fetch the latest Squid version, export it as an environment variable, replace the placeholders in the README template, and push the updated README to the repository.

This entire process runs inside a container built from the image defined by the variable $CONTAINER_CLIENT_IMAGE.
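The version-extraction one-liner is easiest to understand in isolation. The sketch below replays the same grep/cut/tr pipeline on a hypothetical line from the releases page (the real page’s markup may differ):

```shell
#!/bin/sh
# Hypothetical excerpt of the GitHub releases page; the real HTML differs.
PAGE_LINE="Squid Cache Release v6.10"

# Same extraction as in getsquid_vars:
# keep the first "Release" line, take the 4th space-separated field,
# then strip the leading "v".
SQUID_VERSION=$(echo "$PAGE_LINE" | grep -m 1 "Release" | cut -d " " -f4 | tr -d 'v')
echo "$SQUID_VERSION"    # prints 6.10
```

Note that tr -d 'v' deletes every “v” in the field, which is safe only because Squid version numbers contain none.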
These jobs are responsible for building the AMD64 and ARM Docker images.
    docker-hub-build:
      stage: Docker-hub-build
      image: docker:dind
      needs:
        - getsquid_vars
      artifacts:
        expire_in: 2 hours
        paths:
          - $CI_PROJECT_DIR
      timeout: 3 hours
      before_script:
        - docker login -u "$DOCKER_HUB_USER" -p "$DOCKER_HUB_TOKEN" $DOCKER_HUB_REGISTRY
      script:
        - source variables.env
        - docker build --build-arg SQUID_VERSION=$SQUID_VERSION --pull -t $CONTAINER_BUILD_NOPROD_NAME_AMD64 .
        - docker push $CONTAINER_BUILD_NOPROD_NAME_AMD64

- image specifies that the job runs in a Docker-in-Docker (DinD) environment.
- needs specifies the job dependencies; here, the job depends on the getsquid_vars job.
- before_script includes the Docker login command, which is required to push images to Docker Hub.
- script carries out a Docker build passing the Squid version as a build argument, tags the image, and pushes it to Docker Hub.

These jobs are responsible for validating that the Docker images were built correctly and can run successfully.
    docker-hub-test:
      stage: Docker-hub-test
      extends: .services-amd64
      before_script:
        - apt update && apt install -y curl --no-upgrade --no-install-recommends --no-install-suggests
      script:
        - export https_proxy=http://$CONTAINER_TEST_NAME:3128 && curl -k https://www.google.fr
      variables:
        HOSTNAME: squidpipeline
      needs: ["docker-hub-build"]

- script runs a curl request to a website through the proxy; this tests whether Squid can successfully proxy HTTP requests.
- extends specifies that this job inherits the settings from .services-amd64, a hidden configuration job that spins up the service required for this job.

These jobs are responsible for pushing Docker images to the Docker Hub registry, both for the AMD64 and ARM architectures.
    push-docker-hub:
      stage: Docker-hub-pushtag
      image: docker:dind
      needs:
        - docker-hub-test
        - getsquid_vars
      before_script:
        - docker login -u "$DOCKER_HUB_USER" -p "$DOCKER_HUB_TOKEN" $DOCKER_HUB_REGISTRY
      script:
        - source variables.env
        - docker pull $CONTAINER_BUILD_NOPROD_NAME_AMD64
        - docker tag $CONTAINER_BUILD_NOPROD_NAME_AMD64 $HUB_REGISTRY_IMAGE:$SQUID_VERSION-amd64
        - docker push $HUB_REGISTRY_IMAGE:$SQUID_VERSION-amd64
        - docker tag $CONTAINER_BUILD_NOPROD_NAME_AMD64 $HUB_REGISTRY_IMAGE:latest-amd64
        - docker push $HUB_REGISTRY_IMAGE:latest-amd64
        - docker tag $CONTAINER_BUILD_NOPROD_NAME_AMD64 $HUB_REGISTRY_IMAGE:latest
        - docker push $HUB_REGISTRY_IMAGE:latest
      variables:
        GIT_STRATEGY: none
      only:
        - master

script pulls the freshly built image, tags it several times (with the specific Squid version, latest-amd64, and latest), and pushes each tag to Docker Hub.

The ChatGPT analysis job generates an in-depth explanation of the CI/CD pipeline and updates the analysis report on an external server.
    chatgpt_analysis:
      stage: Docs
      image:
        name: $CONTAINER_CLIENT_IMAGE
      artifacts:
        expire_in: 1 month
        paths:
          - $CI_PROJECT_DIR/chatgpt_analysis*
      needs:
        - getsquid_vars
        - docker-hub-test
        - docker-hub-test-arm
      before_script:
        - apt update && apt install curl git jq ca-certificates pandoc openssh-client -y --no-upgrade --no-install-recommends --no-install-suggests
        - source variables.env
        - SQUID_VERSION=squid-$SQUID_VERSION
      script:
        - JOBS_CONTENT=$(cat .gitlab-ci.yml gitlabci/*)
        - LAST_COMMIT=$(git log -1 --pretty=format:"%h %s%n%b")
        - CONTENT="...."
        - JSON_CONTENT=$(jq -n --arg model "gpt-4" --arg content "$CONTENT" '{model:$model, messages:[{role:"user", content:$content}] }')
        - RESPONSE=$(curl -X POST https://api.openai.com/v1/chat/completions -H "Authorization:Bearer $CHATGPT_API_KEY" -H "Content-Type:application/json" -d "$JSON_CONTENT")
        - ANSWER=$(echo $RESPONSE | jq 'del(.choices[0].message.content)')
        - RESPONSE=$(echo $RESPONSE | jq -r '.choices[0].message.content')
        - echo "$ANSWER"
        - echo -e "$RESPONSE" > chatgpt_analysis_$(date +%Y%m%d).md
        - mkdir -p ~/.ssh
        - eval $(ssh-agent -s)
        - '[[ -f /.dockerenv ]] && echo -e "Host *
          StrictHostKeyChecking no
          " > ~/.ssh/config'
        - ssh-add <(echo "$SSH_NOSTROMO_KEY")
        - pandoc -s --from=markdown+smart --to=html --metadata=encoding=UTF-8 -o chatgpt_analysis_$(date +%Y%m%d).html chatgpt_analysis_$(date +%Y%m%d).md
        - scp -P 822 -r chatgpt_analysis*.html e2git@e2guardian.numsys.eu:/datas/e2/html/squid-ci/
        - echo "!!! See Artifact for explanations or https://e2guardian.numsys.eu !!!"
      only:
        - master

script generates a comprehensive CI/CD pipeline description using OpenAI’s chat completions API (with the gpt-4 model), converts the resulting markdown to HTML using pandoc, and then transfers the HTML file to an external server.

The update_dockerhub_readme job is responsible for updating the Docker Hub repository’s readme.
    update_dockerhub_readme:
      image:
        name: $CONTAINER_CLIENT_IMAGE
      stage: Docs
      artifacts:
      needs:
        - getsquid_vars
      before_script:
        - apt update && apt install -y curl jq ca-certificates --no-upgrade --no-install-recommends --no-install-suggests
      script:
        - README_CONTENT=$(cat README.md)
        - PAYLOAD=$(jq -n --arg desc "$README_CONTENT" '{"full_description":$desc}')
        - echo "Payload JSON:$PAYLOAD"
        - TOKEN=$(curl -v -s -X POST -H "Content-Type:application/json" -d '{"username":"'"$DOCKER_HUB_USER"'","password":"'"$DOCKER_HUB_PASSWORD"'"}' https://hub.docker.com/v2/users/login/ | jq -r .token)
        - curl -X PATCH -H "Authorization:JWT $TOKEN" -H "Content-Type:application/json" -d "$PAYLOAD" https://hub.docker.com/v2/repositories/$HUB_REGISTRY_IMAGE
      only:
        - master

script builds the JSON payload from the contents of README.md, obtains a Docker Hub JWT token for the user, and sends a PATCH request to update the Docker Hub repository’s readme.

Several environment variables are used throughout the pipeline:

- GITLAB_TOKEN is used to authenticate git operations.
- SSH_NOSTROMO_KEY is used to authenticate the SCP transfer.
- DOCKER_HUB_USER and DOCKER_HUB_TOKEN are used to authenticate Docker operations.
- CONTAINER_CLIENT_IMAGE is the base image used by several jobs.
- HUB_REGISTRY_IMAGE refers to the Docker Hub image repository.
- CONTAINER_BUILD_NOPROD_NAME_AMD64 and CONTAINER_BUILD_NOPROD_NAME_ARM are the Docker images built by the docker-hub-build and docker-hub-build-arm jobs.

variables.env is a file that is created and passed between jobs. It is created by getsquid_vars and then used by other jobs to retrieve the version of Squid that has been fetched.
The file README_template.md is used as a template for creating the updated README.md file in the getsquid_vars job.
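The template substitution can be reproduced in isolation. Below is a small sketch using a hypothetical one-line template (the real README_template.md contains much more than this):

```shell
#!/bin/sh
# Hypothetical one-line template; the real README_template.md is larger.
printf 'Squid {{SQUID_VERSION}} image, built {{DATE}}\n' > README_template.md

# In the pipeline this value comes from the curl lookup; 6.10 is made up here.
SQUID_VERSION=6.10

# Same in-place substitutions as in getsquid_vars.
sed -i "s/{{SQUID_VERSION}}/$SQUID_VERSION/g" README_template.md
sed -i "s/{{DATE}}/$(date +%Y%m%d)/g" README_template.md
cp README_template.md README.md

cat README.md    # e.g. "Squid 6.10 image, built 20240101"
```

Because sed edits the template in place before it is copied, the {{…}} placeholders never reach the published README.md.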
Some jobs in one stage depend on the completion of jobs from earlier stages; GitLab manages these dependencies via the needs keyword. For example, the docker-hub-build job depends on getsquid_vars finishing first, because it needs the Squid version that job fetched. The docker-hub-test job cannot start until docker-hub-build has finished, because it tests the Docker image that docker-hub-build produced. Similarly, push-docker-hub depends on both docker-hub-test and getsquid_vars: the Docker image must be tested before it is pushed to Docker Hub, and the Squid version is needed for proper tagging.
The getsquid_vars job outputs the variables.env file as an artifact, which is then used by other jobs to retrieve the version of Squid that has been fetched.
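The hand-off itself is plain dotenv-style sourcing; a minimal sketch with a hypothetical version number:

```shell
#!/bin/sh
# In getsquid_vars: write the artifact (the version value is made up here).
echo "SQUID_VERSION=6.10" > variables.env

# In a later job: GitLab restores the artifact into the workspace,
# so sourcing the file brings SQUID_VERSION back into the environment.
. ./variables.env
echo "$SQUID_VERSION"    # prints 6.10
```

This is why jobs such as docker-hub-build start their script with `source variables.env` before using $SQUID_VERSION.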
The chatgpt_analysis job produces a markdown file and an HTML file containing explanations generated through OpenAI’s API (the gpt-4 model) based on the pipeline jobs. The HTML file is uploaded to an external server.
Additionally, the Docker images produced by
docker-hub-build and docker-hub-build-arm jobs
and pushed to Docker Hub by push-docker-hub and
push-docker-hub-arm jobs are the main artifacts produced by
the pipeline.
The latest commit in the Git history is “README Auto update [skip
ci]”, with the commit hash of “b4d9ee5”. This commit automatically
updates the README.md file with the latest version of Squid fetched by
the getsquid_vars job. This update is carried out by a
script within the job, which replaces placeholders in a template with
the actual Squid version and date.
The [skip ci] marker in the commit message prevents the
pipeline from running again after the commit is pushed, avoiding a
potential infinite loop of pipelines.
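The marker’s effect can be illustrated with a simple substring check; this is only an illustration of the convention, not how GitLab actually implements the skip logic:

```shell
#!/bin/sh
# Commit message matching the one the getsquid_vars job pushes.
MSG="README Auto update [skip ci]"

# GitLab skips the pipeline when the message contains a skip marker.
case "$MSG" in
  *"[skip ci]"*|*"[ci skip]"*) echo "pipeline skipped" ;;
  *)                           echo "pipeline runs" ;;
esac
# prints: pipeline skipped
```

Without the marker, the README commit pushed by getsquid_vars would itself trigger a new pipeline, which would push another commit, and so on.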