Building Python packages via GitLab pipeline with type checking
My main programming language is Python and I’m a huge fan of static type hinting. As of 2026 there are four type checkers: mypy, pyright, pyrefly, and ty.
I like to integrate them into my GitLab workflow, which includes generating a Code Quality report.
On top of this, my pipeline also runs Astral’s ruff as a linter and code formatter and uses Astral’s uv to build and publish the Python package to GitLab’s PyPI package registry.
The complete example is available on gitlab.com.
Basic GitLab pipeline
workflow:
  rules:
    - if: $CI_PIPELINE_SOURCE == "merge_request_event" # MR
    - if: $CI_COMMIT_BRANCH && $CI_OPEN_MERGE_REQUESTS
      when: never
    - if: $CI_COMMIT_BRANCH # Branch,Schedule,Web,CLI
    - if: $CI_COMMIT_TAG # Tag
My workflow requires pipelines for merge requests, for stable/default/protected branches, for tags, and for manually or schedule-triggered runs. The second rule suppresses duplicate branch pipelines while a merge request is open.
stages:
  - lint
  - build
  - publish
  - release

default:
  interruptible: true
  artifacts:
    expire_in: 1 day

variables:
  FF_SCRIPT_SECTIONS: true
  FF_TIMESTAMPS: true
  FF_USE_NEW_BASH_EVAL_STRATEGY: true

.py:
  rules:
    - changes:
        paths:
          - pyproject.toml
          - uv.lock
          - "**/*.py"
  variables:
    GIT_DEPTH: 1
I set several GitLab Runner feature flags for a better job log and script-handling experience.
Prepare uv
.uv:
  variables:
    UV_VERSION: "0.11"
    PYTHON_VERSION: "3.13"
    BASE_LAYER: trixie
    # GitLab CI creates a separate mountpoint for the build directory,
    # so we need to copy instead of using hard links.
    UV_LINK_MODE: copy
    UV_CACHE_DIR: .uv-cache
  image: ghcr.io/astral-sh/uv:$UV_VERSION-python$PYTHON_VERSION-$BASE_LAYER
  cache:
    - key:
        files:
          - uv.lock
      paths:
        - $UV_CACHE_DIR
  after_script:
    - uv cache prune --ci
I’m pinning uv to a specific version here and using the image variant based on Debian 13 “Trixie”.
This sets up caching as documented in uv’s GitLab integration.
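With cache:key:files, GitLab derives the cache key from a checksum of uv.lock, so the cache rolls over whenever the dependencies change. A sketch of the idea (GitLab’s actual key computation is internal; this just illustrates the content-hash principle):

```python
import hashlib
import tempfile
from pathlib import Path

def cache_key(lockfile: Path) -> str:
    """Hash the lockfile content: new dependencies yield a new cache key."""
    return hashlib.sha256(lockfile.read_bytes()).hexdigest()

with tempfile.TemporaryDirectory() as d:
    lock = Path(d) / "uv.lock"
    lock.write_text("version = 1\n")
    key1 = cache_key(lock)
    lock.write_text("version = 2\n")  # simulate a dependency change
    key2 = cache_key(lock)
    assert key1 != key2  # changed lockfile invalidates the old cache
```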
Running ruff
.ruff:
  extends: [.uv, .py]
  stage: lint

ruff check:
  extends: [.ruff]
  script:
    - uvx ruff check --output-format=gitlab --output-file=code-quality-report.json
  artifacts:
    reports:
      codequality: $CI_PROJECT_DIR/code-quality-report.json

ruff format:
  extends: [.ruff]
  script:
    - uvx ruff format --diff
This runs ruff twice:
- once as a linter, to check for common issues
- once as a code formatter, to verify that the code follows the configured style
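The codequality report artifact follows GitLab’s Code Quality format, a JSON array of findings in the Code Climate style. A minimal sketch of one entry as GitLab consumes it (the message, path, and line are illustrative):

```python
import hashlib
import json

finding = {
    "description": "Line too long (101 > 100)",  # illustrative message
    "check_name": "E501",
    "severity": "minor",
    # The fingerprint must be stable across runs so GitLab can diff
    # findings between pipelines; hashing path+check+line is one way.
    "fingerprint": hashlib.sha256(b"src/app.py:E501:12").hexdigest(),
    "location": {"path": "src/app.py", "lines": {"begin": 12}},
}

# the report file is a JSON array of such entries
report = json.dumps([finding], indent=2)
```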
Running the type checkers
.lint:
  stage: lint
  extends: [.uv, .py]
  artifacts:
    reports:
      codequality: $CI_PROJECT_DIR/gl-code-quality-report.json

mypy:
  extends: [.lint]
  script:
    - uvx mypy --no-error-summary >mypy-out.txt
  after_script:
    - uvx mypy-gitlab-code-quality <mypy-out.txt >gl-code-quality-report.json
    - !reference [.uv, after_script]

pyright:
  extends: [.lint]
  script:
    - uvx --from=pyright[nodejs] pyright --outputjson >pyright-raw.json
  after_script:
    - uvx pyright-to-gitlab -i pyright-raw.json -o gl-code-quality-report.json
    - !reference [.uv, after_script]

pyrefly:
  extends: [.uv, .py] # .lint
  stage: lint # TEMPORARY
  script:
    - uvx pyrefly check # --output-format CodeQuality --output gl-code-quality-report.json

ty:
  extends: [.lint]
  script:
    - uvx ty check --output-format gitlab >gl-code-quality-report.json
mypy and pyright do not generate the Code Quality JSON themselves.
They require a converter that transforms their output into that format.
The conversion happens in after_script so that it always runs, even when the type checker aborts with an exit code other than 0.
Defining after_script here overrides the one from the template job .uv, which calls uv cache prune --ci to maintain its cache.
As such we have to restore that functionality, using !reference to do so.
ty already generates the required JSON natively.
For pyrefly there is issue 3049, in which I asked for native support.
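As a sketch of what such a converter does (mypy-gitlab-code-quality’s actual implementation differs), here is the gist of turning mypy’s plain `path:line: severity: message` lines into Code Quality entries:

```python
import hashlib
import json
import re

# mypy lines look like: "pkg/mod.py:12: error: Incompatible return value  [return-value]"
LINE_RE = re.compile(r"^(?P<path>[^:]+):(?P<line>\d+): (?P<sev>\w+): (?P<msg>.*)$")

def convert(mypy_output: str) -> str:
    """Translate mypy text output into a Code Quality JSON array."""
    findings = []
    for line in mypy_output.splitlines():
        m = LINE_RE.match(line)
        if not m:
            continue  # skip summary or note lines that don't match
        findings.append({
            "description": m["msg"],
            "check_name": "mypy",
            "severity": "major" if m["sev"] == "error" else "info",
            "fingerprint": hashlib.sha256(line.encode()).hexdigest(),
            "location": {"path": m["path"], "lines": {"begin": int(m["line"])}},
        })
    return json.dumps(findings)

out = convert('app.py:3: error: Name "x" is not defined  [name-defined]')
```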
Build and publish the Python package
build python package:
  stage: build
  extends: [.uv]
  rules:
    - if: $CI_COMMIT_BRANCH
    - if: $CI_COMMIT_TAG
  variables:
    GIT_DEPTH: 0
    GIT_FETCH_EXTRA_FLAGS: --prune --quiet --tags --filter=tree:0
  script:
    # - |
    #   VERSION=$(git describe --exact-match --tags) &&
    #   uvx --from=toml-cli toml set --toml-path=pyproject.toml project.version "$VERSION"
    - uv build
  artifacts:
    paths:
      - dist/
This builds the Python package.
Later, uv publish will fail when you try to upload a Python package with a version that already exists.
This happens mostly because I forget to bump the version in the [project] section of pyproject.toml.
The commented-out code above would poke the version into the file before each build.
My alternative was to switch to setuptools-scm:
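A minimal pyproject.toml sketch of such a setup, assuming a setuptools build backend (the package name is a placeholder; see the setuptools-scm documentation for the exact options):

```toml
[build-system]
requires = ["setuptools>=64", "setuptools-scm>=8"]
build-backend = "setuptools.build_meta"

[project]
name = "my-package"    # placeholder name
dynamic = ["version"]  # the version comes from git tags, not from this file

[tool.setuptools_scm]
# the empty table is enough to enable setuptools-scm
```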
This uses git describe to generate the version from git tags, which requires two things:
- The image must contain the git binary. Debian’s -slim images do not, so switch to the non--slim variants.
- Fetch enough history: GIT_DEPTH: 1 may not be enough to walk the commits from HEAD back to the previous git tag. Therefore I use GIT_DEPTH: 0 to fetch the complete history, but combine it with GIT_FETCH_EXTRA_FLAGS: --filter=tree:0. This uses git’s partial-clone feature: it fetches all commit objects, but only the tree and blob objects required for HEAD.
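For illustration, a sketch of how git describe output maps to a version string. Note that setuptools-scm’s real version scheme is more elaborate (it guesses the next release and adds .devN and node suffixes); the post-release scheme below is just an example:

```python
import re

# "git describe --tags" prints either "v1.2.3" (exactly on a tag)
# or "v1.2.3-4-gdeadbee" (4 commits after the tag, at commit deadbee).
DESCRIBE_RE = re.compile(
    r"^v?(?P<tag>\d+\.\d+\.\d+)(?:-(?P<n>\d+)-g(?P<node>[0-9a-f]+))?$"
)

def version_from_describe(describe: str) -> str:
    m = DESCRIBE_RE.match(describe)
    if m is None:
        raise ValueError(f"unexpected describe output: {describe!r}")
    if m["n"] is None:
        return m["tag"]  # release build, exactly on a tag
    # illustrative post-release scheme, not setuptools-scm's default
    return f"{m['tag']}.post{m['n']}+g{m['node']}"

assert version_from_describe("v1.2.3") == "1.2.3"
assert version_from_describe("v1.2.3-4-gdeadbee") == "1.2.3.post4+gdeadbee"
```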
publish python package:
  stage: publish
  extends: [.uv]
  rules:
    - if: $CI_COMMIT_TAG
  needs:
    - job: build python package
  variables:
    GIT_STRATEGY: none
    UV_PUBLISH_URL: ${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/packages/pypi
    UV_PUBLISH_USERNAME: gitlab-ci-token
    UV_PUBLISH_PASSWORD: ${CI_JOB_TOKEN}
  script:
    - uv publish dist/*.whl
This publishes the Python package to GitLab’s PyPI package registry.
Creating a GitLab release
release_job:
  stage: release
  image: registry.gitlab.com/gitlab-org/cli:latest
  rules:
    - if: '$CI_COMMIT_TAG =~ /^v?\d+\.\d+\.\d+$/'
  variables:
    GIT_STRATEGY: none
    GLAB_CONFIG_DIR: ${CI_PROJECT_DIR}/.glab-config.${CI_PIPELINE_ID}
    GLAB_ENABLE_CI_AUTOLOGIN: true
    GITLAB_HOST: $CI_SERVER_URL
  dependencies: []
  script:
    - >
      glab changelog generate >changelog.md
      --repo "$CI_PROJECT_PATH"
      --version "$CI_COMMIT_TAG"
      --to "$CI_COMMIT_BRANCH"
  release:
    name: 'Release $CI_COMMIT_TAG'
    description: changelog.md
    tag_name: $CI_COMMIT_TAG
    assets:
      links:
        - name: 'PyPI package $CI_COMMIT_TAG'
          url: $CI_PROJECT_URL/-/packages/
          link_type: package
The final part creates a GitLab release.
It uses GitLab’s changelog API to automatically generate a Markdown changelog from the git commits that carry a Changelog: trailer.
For my environment I have to tell glab to use a configuration file in a writable directory.
Without that it tries to write to /.glab/, which fails.
The assets:links: part adds a link to the package in the PyPI package registry.
GitLab 18.11 just received a feature that includes packages as release evidence, which might make this link optional.