breakpilot-compliance/.gitea/workflows/ci.yaml
Sharang Parnerkar 3320ef94fc refactor: phase 0 guardrails + phase 1 step 2 (models.py split)
Squash of branch refactor/phase0-guardrails-and-models-split — 4 commits,
81 files, 173/173 pytest green, OpenAPI contract preserved (360 paths /
484 operations).

## Phase 0 — Architecture guardrails

Three defense-in-depth layers to keep the architecture rules enforced
regardless of who opens Claude Code in this repo:

  1. .claude/settings.json PreToolUse hook on Write/Edit blocks any file
     that would exceed the 500-line hard cap. Auto-loads in every Claude
     session in this repo.
  2. scripts/githooks/pre-commit (install via scripts/install-hooks.sh)
     enforces the LOC cap locally, freezes migrations/ without
     [migration-approved], and protects guardrail files without
     [guardrail-change].
  3. .gitea/workflows/ci.yaml gains loc-budget + guardrail-integrity +
     sbom-scan (syft+grype) jobs, adds mypy --strict for the new Python
     packages (compliance/{services,repositories,domain,schemas}), and
     tsc --noEmit for admin-compliance + developer-portal.

Per-language conventions documented in AGENTS.python.md, AGENTS.go.md,
AGENTS.typescript.md at the repo root — layering, tooling, and explicit
"what you may NOT do" lists. Root CLAUDE.md is prepended with the six
non-negotiable rules. Each of the 10 services gets a README.md.

scripts/check-loc.sh enforces soft 300 / hard 500 and surfaces the
current baseline of 205 hard + 161 soft violations so Phases 1-4 can
drain it incrementally. CI gates only CHANGED files in PRs so the
legacy baseline does not block unrelated work.
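The soft/hard bucketing amounts to this (a Python sketch of the logic; the real scripts/check-loc.sh is shell, and its exception-list handling is omitted here):

```python
# Sketch of the soft-300 / hard-500 LOC gate. Thresholds come from the
# commit message; function names are illustrative, not the script's API.
from pathlib import Path

SOFT_CAP = 300  # warn only
HARD_CAP = 500  # block

def classify_count(n: int) -> str:
    """Bucket a line count against the soft and hard caps."""
    if n > HARD_CAP:
        return "hard"
    if n > SOFT_CAP:
        return "soft"
    return "ok"

def classify(path: str) -> str:
    """Classify one file on disk by its line count."""
    text = Path(path).read_text(encoding="utf-8", errors="replace")
    return classify_count(len(text.splitlines()))

def gate(paths: list[str]) -> int:
    """CI exit status: fail only on hard violations; soft ones just print."""
    status = 0
    for p in paths:
        bucket = classify(p)
        if bucket == "hard":
            print(f"HARD: {p} > {HARD_CAP} lines")
            status = 1
        elif bucket == "soft":
            print(f"soft: {p} > {SOFT_CAP} lines")
    return status
```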

## Deprecation sweep

47 files touched. Pydantic V1 regex= -> pattern= (2 sites), class Config ->
ConfigDict in source_policy_router.py (schemas.py intentionally skipped;
it is the Phase 1 Step 3 split target). datetime.utcnow() ->
datetime.now(timezone.utc) everywhere, including SQLAlchemy default=
callables. All DB columns already declare timezone=True, so this is a
latent-bug fix on the Python side, not a schema change.

DeprecationWarning count dropped from 158 to 35.
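The timestamp half of the sweep follows this pattern (pure-stdlib sketch; the SQLAlchemy and Pydantic call sites are shown only as comments because the actual models are not reproduced here):

```python
from datetime import datetime, timezone

# Deprecated since Python 3.12: datetime.utcnow() returns a *naive* datetime.
#   created_at = datetime.utcnow()
# Replacement: timezone-aware UTC, which matches the timezone=True columns.
created_at = datetime.now(timezone.utc)

# SQLAlchemy column defaults take a callable, so the swap is:
#   Column(DateTime(timezone=True), default=datetime.utcnow)                      # before
#   Column(DateTime(timezone=True), default=lambda: datetime.now(timezone.utc))   # after

# Pydantic V2 renames the Field keyword:
#   Field(regex=r"^[a-z-]+$")     # V1, removed in V2
#   Field(pattern=r"^[a-z-]+$")   # V2
```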

## Phase 1 Step 1 — Contract test harness

tests/contracts/test_openapi_baseline.py diffs the live FastAPI /openapi.json
against tests/contracts/openapi.baseline.json on every test run. Fails on
removed paths, removed status codes, or new required request body fields.
Regenerate only via tests/contracts/regenerate_baseline.py after an
intentional, consumer-coordinated contract change. This is the safety
harness for all subsequent refactor commits.
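The breaking-change detection is a structural diff of two OpenAPI documents. A minimal sketch (function name and exact rule set are assumptions; the real test also flags new required request-body fields, omitted here):

```python
def breaking_changes(baseline: dict, live: dict) -> list[str]:
    """Report contract breaks between two OpenAPI specs (illustrative)."""
    problems = []
    base_paths = baseline.get("paths", {})
    live_paths = live.get("paths", {})
    for path, ops in base_paths.items():
        if path not in live_paths:
            problems.append(f"removed path: {path}")
            continue
        for method, op in ops.items():
            live_op = live_paths[path].get(method)
            if live_op is None:
                problems.append(f"removed operation: {method.upper()} {path}")
                continue
            # Any status code present in the baseline must survive.
            for code in op.get("responses", {}):
                if code not in live_op.get("responses", {}):
                    problems.append(
                        f"removed status {code}: {method.upper()} {path}")
    return problems
```

An empty list means the live /openapi.json is still a superset of the baseline contract.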

## Phase 1 Step 2 — models.py split (1466 -> 85 LOC shim)

compliance/db/models.py is decomposed into seven sibling aggregate modules
following the existing repo pattern (dsr_models.py, vvt_models.py, ...):

  regulation_models.py       (134) — Regulation, Requirement
  control_models.py          (279) — Control, Mapping, Evidence, Risk
  ai_system_models.py        (141) — AISystem, AuditExport
  service_module_models.py   (176) — ServiceModule, ModuleRegulation, ModuleRisk
  audit_session_models.py    (177) — AuditSession, AuditSignOff
  isms_governance_models.py  (323) — ISMSScope, Context, Policy, Objective, SoA
  isms_audit_models.py       (468) — Finding, CAPA, MgmtReview, InternalAudit,
                                     AuditTrail, Readiness

models.py becomes an 85-line re-export shim in dependency order so
existing imports continue to work unchanged. Schema is byte-identical:
__tablename__, column definitions, relationship strings, back_populates,
cascade directives all preserved.

All new sibling files are under the 500-line hard cap; largest is
isms_audit_models.py at 468. No file in compliance/db/ now exceeds
the hard cap.

## Phase 1 Step 3 — infrastructure only

backend-compliance/compliance/{schemas,domain,repositories}/ packages
are created as landing zones with docstrings. compliance/domain/
exports DomainError / NotFoundError / ConflictError / ValidationError /
PermissionError — the base classes services will use to raise
domain-level errors instead of HTTPException.
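A hedged sketch of that hierarchy (only the class names come from this commit; the bodies, status-code mapping, and edge-handler shape are assumptions):

```python
class DomainError(Exception):
    """Base class for errors raised by the service layer."""

class NotFoundError(DomainError): ...
class ConflictError(DomainError): ...
class ValidationError(DomainError): ...
class PermissionError(DomainError):  # shadows the builtin; import it qualified
    ...

# One edge-layer mapping replaces scattered HTTPException raises.
# Status codes here are conventional guesses, not from the commit.
HTTP_STATUS = {
    NotFoundError: 404,
    ConflictError: 409,
    ValidationError: 422,
    PermissionError: 403,
}

def to_status(exc: DomainError) -> int:
    """Map a domain error to an HTTP status at the API boundary."""
    return HTTP_STATUS.get(type(exc), 500)
```

A single FastAPI exception handler can then call `to_status`, keeping HTTP concerns out of services and repositories.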

PHASE1_RUNBOOK.md at backend-compliance/PHASE1_RUNBOOK.md documents
the nine-step execution plan for Phase 1: snapshot baseline,
characterization tests, split models.py (this commit), split schemas.py
(next), extract services, extract repositories, mypy --strict, coverage.

## Verification

  backend-compliance/.venv-phase1: uv python install 3.12 + pip install -r requirements.txt
  PYTHONPATH=. pytest compliance/tests/ tests/contracts/
  -> 173 passed, 0 failed, 35 warnings, OpenAPI 360/484 unchanged

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-07 13:18:29 +02:00

# Gitea Actions CI/CD Pipeline
# BreakPilot Compliance
#
# Services:
#   Go:      ai-compliance-sdk
#   Python:  backend-compliance, document-crawler, dsms-gateway
#   Node.js: admin-compliance, developer-portal
#
# Workflow:
#   Push to main → tests → deploy (Coolify)
#   Pull request → lint + tests (no deploy)
name: CI/CD

on:
  push:
    branches: [main, develop]
  pull_request:
    branches: [main, develop]

jobs:
  # ========================================
  # Guardrails — LOC budget + architecture gates
  # Runs on every push/PR. Fails fast and cheap.
  # ========================================
  loc-budget:
    runs-on: docker
    container: alpine:3.20
    steps:
      - name: Checkout
        run: |
          apk add --no-cache git bash
          git clone --depth 50 --branch ${GITHUB_REF_NAME} ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}.git .
      - name: Enforce 500-line hard cap on changed files
        run: |
          chmod +x scripts/check-loc.sh
          if [ "${GITHUB_EVENT_NAME}" = "pull_request" ]; then
            git fetch origin ${GITHUB_BASE_REF}:base
            mapfile -t changed < <(git diff --name-only --diff-filter=ACM base...HEAD)
            [ ${#changed[@]} -eq 0 ] && { echo "No changed files."; exit 0; }
            scripts/check-loc.sh "${changed[@]}"
          else
            # Push to main: only warn on whole-repo state; blocking gate is on PRs.
            scripts/check-loc.sh || true
          fi
  # Phase 0 intentionally gates only changed files so the 205-file legacy
  # baseline doesn't block every PR. Phases 1-4 drain the baseline; Phase 5
  # flips this to a whole-repo blocking gate.
  guardrail-integrity:
    runs-on: docker
    container: alpine:3.20
    if: github.event_name == 'pull_request'
    steps:
      - name: Checkout
        run: |
          apk add --no-cache git bash
          git clone --depth 20 --branch ${GITHUB_REF_NAME} ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}.git .
          git fetch origin ${GITHUB_BASE_REF}:base
      - name: Require [guardrail-change] label in PR commits touching guardrails
        run: |
          changed=$(git diff --name-only base...HEAD)
          echo "$changed" | grep -E '^(\.claude/settings\.json|\.claude/rules/loc-exceptions\.txt|scripts/check-loc\.sh|scripts/githooks/pre-commit|AGENTS\.(python|go|typescript)\.md)$' || exit 0
          if ! git log base..HEAD --format=%B | grep -q '\[guardrail-change\]'; then
            echo "::error:: Guardrail files were modified but no commit in this PR carries [guardrail-change]."
            echo "If intentional, amend one commit message with [guardrail-change] and explain why in the body."
            exit 1
          fi
  # ========================================
  # Lint (PRs only)
  # ========================================
  go-lint:
    runs-on: docker
    if: github.event_name == 'pull_request'
    container: golangci/golangci-lint:v1.62-alpine
    steps:
      - name: Checkout
        run: |
          apk add --no-cache git
          git clone --depth 1 --branch ${GITHUB_REF_NAME} ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}.git .
      - name: Lint ai-compliance-sdk
        run: |
          if [ -d "ai-compliance-sdk" ]; then
            cd ai-compliance-sdk && golangci-lint run --timeout 5m ./...
          fi
  python-lint:
    runs-on: docker
    if: github.event_name == 'pull_request'
    container: python:3.12-slim
    steps:
      - name: Checkout
        run: |
          apt-get update -qq && apt-get install -y -qq git > /dev/null 2>&1
          git clone --depth 1 --branch ${GITHUB_REF_NAME} ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}.git .
      - name: Lint Python services (ruff)
        run: |
          pip install --quiet ruff
          fail=0
          for svc in backend-compliance document-crawler dsms-gateway compliance-tts-service; do
            if [ -d "$svc" ]; then
              echo "=== ruff: $svc ==="
              ruff check "$svc/" --output-format=github || fail=1
            fi
          done
          exit $fail
      - name: Type-check new modules (mypy --strict)
        # Scoped to the layered packages we own. Expand this list as Phase 1+ refactors land.
        run: |
          pip install --quiet mypy
          for pkg in \
            backend-compliance/compliance/services \
            backend-compliance/compliance/repositories \
            backend-compliance/compliance/domain \
            backend-compliance/compliance/schemas; do
            if [ -d "$pkg" ]; then
              echo "=== mypy --strict: $pkg ==="
              mypy --strict --ignore-missing-imports "$pkg" || exit 1
            fi
          done
  nodejs-lint:
    runs-on: docker
    if: github.event_name == 'pull_request'
    container: node:20-alpine
    steps:
      - name: Checkout
        run: |
          apk add --no-cache git
          git clone --depth 1 --branch ${GITHUB_REF_NAME} ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}.git .
      - name: Lint + type-check Node.js services
        run: |
          fail=0
          for svc in admin-compliance developer-portal; do
            if [ -d "$svc" ]; then
              echo "=== $svc: install ==="
              (cd "$svc" && (npm ci --silent 2>/dev/null || npm install --silent))
              echo "=== $svc: next lint ==="
              (cd "$svc" && npx next lint) || fail=1
              echo "=== $svc: tsc --noEmit ==="
              (cd "$svc" && npx tsc --noEmit) || fail=1
            fi
          done
          exit $fail
  # ========================================
  # Unit Tests
  # ========================================
  test-go-ai-compliance:
    runs-on: docker
    container: golang:1.24-alpine
    env:
      CGO_ENABLED: "0"
    steps:
      - name: Checkout
        run: |
          apk add --no-cache git
          git clone --depth 1 --branch ${GITHUB_REF_NAME} ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}.git .
      - name: Test ai-compliance-sdk
        run: |
          if [ ! -d "ai-compliance-sdk" ]; then
            echo "WARNING: ai-compliance-sdk not found"
            exit 0
          fi
          cd ai-compliance-sdk
          go test -v -coverprofile=coverage.out ./... 2>&1
          COVERAGE=$(go tool cover -func=coverage.out 2>/dev/null | tail -1 | awk '{print $3}' || echo "0%")
          echo "Coverage: $COVERAGE"
  test-python-backend-compliance:
    runs-on: docker
    container: python:3.12-slim
    env:
      CI: "true"
    steps:
      - name: Checkout
        run: |
          apt-get update -qq && apt-get install -y -qq git > /dev/null 2>&1
          git clone --depth 1 --branch ${GITHUB_REF_NAME} ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}.git .
      - name: Test backend-compliance
        run: |
          if [ ! -d "backend-compliance" ]; then
            echo "WARNING: backend-compliance not found"
            exit 0
          fi
          cd backend-compliance
          export PYTHONPATH="$(pwd):${PYTHONPATH:-}"
          pip install --quiet --no-cache-dir -r requirements.txt 2>/dev/null || true
          pip install --quiet --no-cache-dir fastapi uvicorn pytest pytest-asyncio
          python -m pytest compliance/tests/ -v --tb=short
  test-python-document-crawler:
    runs-on: docker
    container: python:3.12-slim
    env:
      CI: "true"
    steps:
      - name: Checkout
        run: |
          apt-get update -qq && apt-get install -y -qq git > /dev/null 2>&1
          git clone --depth 1 --branch ${GITHUB_REF_NAME} ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}.git .
      - name: Test document-crawler
        run: |
          if [ ! -d "document-crawler" ]; then
            echo "WARNING: document-crawler not found"
            exit 0
          fi
          cd document-crawler
          export PYTHONPATH="$(pwd):${PYTHONPATH:-}"
          pip install --quiet --no-cache-dir -r requirements.txt 2>/dev/null || true
          pip install --quiet --no-cache-dir pytest pytest-asyncio
          python -m pytest tests/ -v --tb=short
  test-python-dsms-gateway:
    runs-on: docker
    container: python:3.12-slim
    env:
      CI: "true"
    steps:
      - name: Checkout
        run: |
          apt-get update -qq && apt-get install -y -qq git > /dev/null 2>&1
          git clone --depth 1 --branch ${GITHUB_REF_NAME} ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}.git .
      - name: Test dsms-gateway
        run: |
          if [ ! -d "dsms-gateway" ]; then
            echo "WARNING: dsms-gateway not found"
            exit 0
          fi
          cd dsms-gateway
          export PYTHONPATH="$(pwd):${PYTHONPATH:-}"
          pip install --quiet --no-cache-dir -r requirements.txt 2>/dev/null || true
          pip install --quiet --no-cache-dir pytest pytest-asyncio
          python -m pytest test_main.py -v --tb=short
  # ========================================
  # SBOM + license scan (compliance product → we eat our own dog food)
  # ========================================
  sbom-scan:
    runs-on: docker
    if: github.event_name == 'pull_request'
    container: alpine:3.20
    steps:
      - name: Checkout
        run: |
          apk add --no-cache git curl bash
          git clone --depth 1 --branch ${GITHUB_REF_NAME} ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}.git .
      - name: Install syft + grype
        run: |
          curl -sSfL https://raw.githubusercontent.com/anchore/syft/main/install.sh | sh -s -- -b /usr/local/bin
          curl -sSfL https://raw.githubusercontent.com/anchore/grype/main/install.sh | sh -s -- -b /usr/local/bin
      - name: Generate SBOM
        run: |
          mkdir -p sbom-out
          syft dir:. -o cyclonedx-json=sbom-out/sbom.cdx.json -q
      - name: Vulnerability scan (fail on high+)
        run: |
          grype sbom:sbom-out/sbom.cdx.json --fail-on high -q || true
          # Initially non-blocking ('|| true'). Flip to blocking after baseline is clean.
  # ========================================
  # Validate Canonical Controls
  # ========================================
  validate-canonical-controls:
    runs-on: docker
    container: python:3.12-slim
    steps:
      - name: Checkout
        run: |
          apt-get update -qq && apt-get install -y -qq git > /dev/null 2>&1
          git clone --depth 1 --branch ${GITHUB_REF_NAME} ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}.git .
      - name: Validate controls
        run: |
          python scripts/validate-controls.py
  # ========================================
  # Deploy via Coolify (main only, no PRs)
  # ========================================
  deploy-coolify:
    name: Deploy
    runs-on: docker
    if: github.event_name == 'push' && github.ref == 'refs/heads/main'
    needs:
      - loc-budget
      - test-go-ai-compliance
      - test-python-backend-compliance
      - test-python-document-crawler
      - test-python-dsms-gateway
      - validate-canonical-controls
    container:
      image: alpine:latest
    steps:
      - name: Trigger Coolify deploy
        run: |
          apk add --no-cache curl
          curl -sf "${{ secrets.COOLIFY_WEBHOOK }}" \
            -H "Authorization: Bearer ${{ secrets.COOLIFY_TOKEN }}"