The .gitlab-ci.yml file defines the jobs that carry out the
instructions for GitLab CI/CD. These jobs are organized into several
stages, namely: Quality, Get-version, Docker-hub-build,
Docker-hub-test, Docker-hub-pushtag, Docker-hub-build-arm,
Docker-hub-test-arm, Docker-hub-pushtag-arm, test, and Docs.
The hadolint job lints the Dockerfile using Hadolint. The
getsquid_vars job fetches the latest version of Squid from GitHub and
stores it in an environment variable. The docker-hub-build and
docker-hub-build-arm jobs build the Docker image for the AMD64 and ARM
architectures, respectively. The docker-hub-test and
docker-hub-test-arm jobs test the built Docker images. The
push-docker-hub and push-docker-hub-arm jobs push the Docker images to
Docker Hub with the appropriate tags. The SquidParseConfig and
dive-arm jobs validate the content of the Docker images. The
update_dockerhub_readme and chatgpt_analysis jobs focus on the
documentation: they update the README on Docker Hub and generate an
in-depth explanation of the pipeline. Let's go through each job in
detail, step by step.
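As a rough sketch, the stage list at the top of the file might look like the following (stage names are taken from the list above; their exact order in the real file is an assumption):

```yaml
# Sketch of the stage layout described above; the real
# .gitlab-ci.yml may order the amd64/arm stages differently.
stages:
  - Quality
  - Get-version
  - Docker-hub-build
  - Docker-hub-test
  - Docker-hub-pushtag
  - Docker-hub-build-arm
  - Docker-hub-test-arm
  - Docker-hub-pushtag-arm
  - test
  - Docs
```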
Purpose: This job checks the Dockerfile using Hadolint, a Dockerfile linter.
script:
- hadolint --ignore DL3008 Dockerfile
This command runs the linter on the Dockerfile. It ignores rule
DL3008 ("Pin versions in apt-get install").
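For context, DL3008 fires whenever apt-get install is used without pinned package versions. A pinned install would look like this (the package and version string below are hypothetical):

```dockerfile
# Without --ignore DL3008, hadolint would want each package pinned
# to an exact version, e.g.:
RUN apt-get update && apt-get install -y --no-install-recommends \
    curl=7.88.1-10+deb12u5
```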
Purpose: This job fetches the latest version of Squid from GitHub.
script:
- apt update && apt install git curl ca-certificates -y --no-upgrade --no-install-recommends --no-install-suggests
- export SQUID_VERSION=$(curl -LsXGET https://github.com/squid-cache/squid/releases/latest | grep -m 1 "Release" | cut -d " " -f4 |tr -d 'v')
- echo "SQUID_VERSION=$SQUID_VERSION" > variables.env
The script starts by updating the package list using
apt update and then installs git,
curl, and ca-certificates to fetch the Squid
version from GitHub. The Squid version is extracted from the latest
release page, using a series of commands to filter and format the
output. The version number is then stored in the
variables.env file.
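The filtering chain can be illustrated in isolation. The sample text below is a stand-in for the matched line of the release page, not its real markup:

```shell
# Stand-in for the first line containing "Release" on the releases page
sample='Squid Cache Release v6.9'

# Same filter chain as the job: keep the first "Release" line,
# take the 4th space-separated field, and delete the "v"
SQUID_VERSION=$(echo "$sample" | grep -m 1 "Release" | cut -d " " -f4 | tr -d 'v')
echo "SQUID_VERSION=$SQUID_VERSION"   # prints SQUID_VERSION=6.9
```

Note that this parsing depends entirely on the layout of the GitHub page; if the page changes, the field position passed to cut would need adjusting.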
Purpose: This job builds the Docker image for AMD64 architecture.
script:
- source variables.env
- docker build --build-arg SQUID_VERSION=$SQUID_VERSION --pull -t $CONTAINER_BUILD_NOPROD_NAME_AMD64 .
- docker push $CONTAINER_BUILD_NOPROD_NAME_AMD64
The docker build command builds the Docker image using
the Dockerfile present in the current directory. It uses the
--build-arg option to pass the version number to Dockerfile
and the -t option to tag the built image. The created image
is then pushed to Docker Hub using docker push.
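On the Dockerfile side, the build argument is presumably consumed via an ARG instruction; the lines below are an illustrative sketch, not taken from the real Dockerfile:

```dockerfile
# SQUID_VERSION is supplied by --build-arg on the docker build command line
ARG SQUID_VERSION
RUN echo "Building Squid ${SQUID_VERSION}"
```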
Purpose: This job builds the Docker image for ARM
architecture. The script is similar to
docker-hub-build.
Purpose: This job tests the built Docker image for AMD64 architecture.
script:
- apt update && apt install -y curl --no-upgrade --no-install-recommends --no-install-suggests
- export https_proxy=http://$CONTAINER_TEST_NAME:3128 && curl -k https://www.google.fr
This script installs curl and then uses it to fetch a web page through
the test container, verifying that the Squid proxy is working.
Purpose: This job tests the built Docker image for
ARM architecture. The command is the same as
docker-hub-test.
Purpose: This job pushes the Docker image for AMD64 architecture to Docker Hub.
script:
- source variables.env
- docker pull $CONTAINER_BUILD_NOPROD_NAME_AMD64
- docker tag $CONTAINER_BUILD_NOPROD_NAME_AMD64 $HUB_REGISTRY_IMAGE:$SQUID_VERSION-amd64
- docker push $HUB_REGISTRY_IMAGE:$SQUID_VERSION-amd64
- docker tag $CONTAINER_BUILD_NOPROD_NAME_AMD64 $HUB_REGISTRY_IMAGE:latest-amd64
- docker push $HUB_REGISTRY_IMAGE:latest-amd64
The script starts by pulling the image from Docker Hub. It then tags
the image with the version number and pushes it back to Docker Hub.
Finally, it tags the image as the latest version and pushes
it to Docker Hub.
Purpose: This job pushes the Docker image for ARM
architecture to Docker Hub. The script is the same as
push-docker-hub.
Purpose: This job updates the repo’s README file on Docker Hub.
script:
- README_CONTENT=$(cat README.md)
- PAYLOAD=$(jq -n --arg desc "$README_CONTENT" '{"full_description":$desc}')
- TOKEN=$(curl -v -s -X POST -H "Content-Type:application/json" -d '{"username":"'"$DOCKER_HUB_USER"'","password":"'"$DOCKER_HUB_PASSWORD"'"}' https://hub.docker.com/v2/users/login/ | jq -r .token)
- curl -X PATCH -H "Authorization:JWT $TOKEN" -H "Content-Type:application/json" -d "$PAYLOAD" https://hub.docker.com/v2/repositories/$HUB_REGISTRY_IMAGE
The script stores the README.md content in README_CONTENT and builds a
JSON PAYLOAD from it. It then authenticates with Docker Hub to obtain
an access token. Finally, it sends a PATCH request that replaces the
full description of the Docker Hub repository with the JSON payload.
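The jq step is what makes the payload safe: it escapes quotes and newlines in the README text before it is embedded in JSON. A self-contained illustration, using sample README text rather than the real file:

```shell
# Sample README content with characters that would break naive quoting
README_CONTENT='# My image
Say "hello".'

# jq --arg escapes the text and embeds it as a proper JSON string;
# the newline and the quotes come out backslash-escaped
PAYLOAD=$(jq -n --arg desc "$README_CONTENT" '{"full_description":$desc}')
echo "$PAYLOAD"
```

Building the payload with plain string interpolation instead would produce invalid JSON as soon as the README contains a double quote.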
Purpose: This job generates an in-depth analysis of the pipeline using ChatGPT.
The pipeline uses a range of variables declared in the
variables section. These include environment variables like
CONTAINER_CLIENT_IMAGE,
CONTAINER_BUILD_NOPROD_NAME_ARM,
CONTAINER_BUILD_NOPROD_NAME_AMD64,
CONTAINER_TEST_NAME, DOCKER_HUB_USER,
DOCKER_HUB_TOKEN, HUB_REGISTRY_IMAGE, etc. It
also makes use of the variables.env file that is created by the
getsquid_vars job.
Jobs within the pipeline act as dependencies for the jobs that follow
them. For instance, getsquid_vars fetches the Squid version that
docker-hub-build and docker-hub-build-arm need to build the Docker
image. Likewise, each test job depends on the corresponding build job,
and each push job depends on the corresponding test job.
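In GitLab CI terms, this ordering comes from the stage sequence, and the artifact hand-off can be made explicit with needs. A hypothetical sketch (the real file may rely on stage order alone):

```yaml
docker-hub-build:
  stage: Docker-hub-build
  needs:
    - job: getsquid_vars
      artifacts: true   # pulls in variables.env from the earlier job
  script:
    - source variables.env
    - docker build --build-arg SQUID_VERSION=$SQUID_VERSION --pull -t $CONTAINER_BUILD_NOPROD_NAME_AMD64 .
```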
Typically, the output of a job would be logs that show execution
status and any compiled or built artifacts. In this case, artifacts such
as Docker images are pushed to Docker Hub. In addition, the
getsquid_vars job outputs a variables.env file
which is used by subsequent jobs. The chatgpt_analysis job
generates an analysis report which is sent to a remote server via
SCP.
The latest commit updates the README file automatically with the
current date and the latest Squid version. It then commits this updated
README to the GitLab repo and pushes it, but skips the CI pipeline
during push by using [skip ci] in the commit message.
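The [skip ci] mechanics can be demonstrated in a throwaway repository; the commit message below is a guess at the real one, and the actual job would also need push credentials configured:

```shell
# Create a throwaway repo so the example runs anywhere
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git config user.email "ci@example.com"
git config user.name "ci"

# Regenerate the README with the current date, as the pipeline does
date +%F > README.md
git add README.md

# "[skip ci]" anywhere in the message tells GitLab not to start
# a pipeline for the push that carries this commit
git commit -qm "Update README [skip ci]"
git log -1 --pretty=%s   # prints: Update README [skip ci]
```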