refactor: phase 0 guardrails + phase 1 step 2 (models.py split)
Squash of branch refactor/phase0-guardrails-and-models-split — 4 commits,
81 files, 173/173 pytest green, OpenAPI contract preserved (360 paths /
484 operations).
## Phase 0 — Architecture guardrails
Three defense-in-depth layers to keep the architecture rules enforced
regardless of who opens Claude Code in this repo:
1. A PreToolUse hook in .claude/settings.json on Write/Edit blocks any
write that would push a file past the 500-line hard cap. It auto-loads
in every Claude session in this repo.
2. scripts/githooks/pre-commit (install via scripts/install-hooks.sh)
enforces the LOC cap locally, freezes migrations/ without
[migration-approved], and protects guardrail files without
[guardrail-change].
3. .gitea/workflows/ci.yaml gains loc-budget + guardrail-integrity +
sbom-scan (syft+grype) jobs, adds mypy --strict for the new Python
packages (compliance/{services,repositories,domain,schemas}), and
tsc --noEmit for admin-compliance + developer-portal.
Per-language conventions documented in AGENTS.python.md, AGENTS.go.md,
AGENTS.typescript.md at the repo root — layering, tooling, and explicit
"what you may NOT do" lists. Root CLAUDE.md is prepended with the six
non-negotiable rules. Each of the 10 services gets a README.md.
scripts/check-loc.sh enforces soft 300 / hard 500 and surfaces the
current baseline of 205 hard + 161 soft violations so Phases 1-4 can
drain it incrementally. CI gates only CHANGED files in PRs so the
legacy baseline does not block unrelated work.
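The budget logic can be sketched as follows (an assumption for illustration;
the real check is the shell script scripts/check-loc.sh, and `classify` is a
hypothetical helper name):

```python
# Sketch of the soft 300 / hard 500 LOC budget described above.
from pathlib import Path

SOFT_CAP = 300
HARD_CAP = 500

def classify(path: Path) -> str:
    """Return 'ok', 'soft', or 'hard' for a file's line count."""
    lines = sum(1 for _ in path.open(encoding="utf-8", errors="ignore"))
    if lines > HARD_CAP:
        return "hard"  # gates the commit / CI job
    if lines > SOFT_CAP:
        return "soft"  # counted in the baseline, warned on
    return "ok"
```

Gating only changed files then amounts to running this over the PR's file
list instead of the whole tree.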
## Deprecation sweep
47 files. Pydantic V1 regex= -> pattern= (2 sites), class Config ->
ConfigDict in source_policy_router.py (schemas.py intentionally skipped;
it is the Phase 1 Step 3 split target). datetime.utcnow() ->
datetime.now(timezone.utc) everywhere including SQLAlchemy default=
callables. All DB columns already declare timezone=True, so this is a
latent-bug fix on the Python side, not a schema change.
DeprecationWarning count dropped from 158 to 35.
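The replacement pattern in one place (`utc_now` is an illustrative helper
name, not necessarily what the repo uses):

```python
from datetime import datetime, timezone

def utc_now() -> datetime:
    # Aware UTC timestamp; datetime.utcnow() returned a naive one whose
    # .tzinfo was None, which round-trips badly through timezone=True columns.
    return datetime.now(timezone.utc)

# SQLAlchemy usage (shape assumed): pass the callable itself as default=,
# so it is evaluated at insert time rather than once at import time:
# created_at = Column(DateTime(timezone=True), default=utc_now)
```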
## Phase 1 Step 1 — Contract test harness
tests/contracts/test_openapi_baseline.py diffs the live FastAPI /openapi.json
against tests/contracts/openapi.baseline.json on every test run. Fails on
removed paths, removed status codes, or new required request body fields.
Regenerate the baseline only via tests/contracts/regenerate_baseline.py,
and only after a deliberate contract change whose consumers have been
updated. This is the safety harness for all subsequent refactor commits.
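The diffing idea can be sketched like this (structure is an assumption; the
real test lives in tests/contracts/test_openapi_baseline.py):

```python
# Report removed paths, removed response status codes, and new required
# JSON request-body fields between a baseline and a live OpenAPI document.
def diff_openapi(baseline: dict, live: dict) -> list[str]:
    problems: list[str] = []
    for path, base_ops in baseline.get("paths", {}).items():
        live_ops = live.get("paths", {}).get(path)
        if live_ops is None:
            problems.append(f"removed path: {path}")
            continue
        for method, base_op in base_ops.items():
            live_op = live_ops.get(method)
            if live_op is None:
                problems.append(f"removed operation: {method.upper()} {path}")
                continue
            for status in base_op.get("responses", {}):
                if status not in live_op.get("responses", {}):
                    problems.append(f"removed {status}: {method.upper()} {path}")

            def required(op: dict) -> set:
                schema = (op.get("requestBody", {}).get("content", {})
                            .get("application/json", {}).get("schema", {}))
                return set(schema.get("required", []))

            for field in required(live_op) - required(base_op):
                problems.append(
                    f"new required field {field}: {method.upper()} {path}")
    return problems
```

An empty result means the contract is backward compatible under these three
rules; anything else fails the test run.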
## Phase 1 Step 2 — models.py split (1466 -> 85 LOC shim)
compliance/db/models.py is decomposed into seven sibling aggregate modules
following the existing repo pattern (dsr_models.py, vvt_models.py, ...):
- regulation_models.py (134 LOC) — Regulation, Requirement
- control_models.py (279 LOC) — Control, Mapping, Evidence, Risk
- ai_system_models.py (141 LOC) — AISystem, AuditExport
- service_module_models.py (176 LOC) — ServiceModule, ModuleRegulation, ModuleRisk
- audit_session_models.py (177 LOC) — AuditSession, AuditSignOff
- isms_governance_models.py (323 LOC) — ISMSScope, Context, Policy, Objective, SoA
- isms_audit_models.py (468 LOC) — Finding, CAPA, MgmtReview, InternalAudit,
  AuditTrail, Readiness
models.py becomes an 85-line re-export shim in dependency order so
existing imports continue to work unchanged. Schema is byte-identical:
__tablename__, column definitions, relationship strings, back_populates,
cascade directives all preserved.
All new sibling files are under the 500-line hard cap; largest is
isms_audit_models.py at 468. No file in compliance/db/ now exceeds
the hard cap.
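A toy, self-contained demonstration of the shim pattern (file and class names
mirror the commit; the file bodies are made up):

```python
# After the split, models.py only re-exports, so the pre-split import path
# keeps resolving even though the classes moved to sibling modules.
import sys
import tempfile
from pathlib import Path

pkg = Path(tempfile.mkdtemp()) / "db"
pkg.mkdir()
(pkg / "__init__.py").write_text("")
(pkg / "regulation_models.py").write_text(
    "class Regulation: ...\nclass Requirement: ...\n"
)
(pkg / "models.py").write_text(
    "# Re-export shim, in dependency order; existing imports keep working.\n"
    "from .regulation_models import Regulation, Requirement\n"
    '__all__ = ["Regulation", "Requirement"]\n'
)

sys.path.insert(0, str(pkg.parent))
from db.models import Regulation  # old import path still works

print(Regulation.__module__)  # the class actually lives in the sibling module
```

Because the shim imports in dependency order, SQLAlchemy relationship strings
resolve exactly as before: every mapped class is registered by the time
models.py finishes importing.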
## Phase 1 Step 3 — infrastructure only
backend-compliance/compliance/{schemas,domain,repositories}/ packages
are created as landing zones with docstrings. compliance/domain/
exports DomainError / NotFoundError / ConflictError / ValidationError /
PermissionError — the base classes services will use to raise
domain-level errors instead of HTTPException.
backend-compliance/PHASE1_RUNBOOK.md documents the nine-step execution
plan for Phase 1, including: snapshot baseline, characterization tests,
split models.py (this commit), split schemas.py (next), extract services,
extract repositories, mypy --strict, coverage.
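A sketch of the error hierarchy (only the class names come from the commit;
the status mapping and `to_status` helper are assumptions about how a FastAPI
exception handler might translate them at the edge):

```python
class DomainError(Exception):
    """Base class for domain-level failures raised by services."""

class NotFoundError(DomainError): ...
class ConflictError(DomainError): ...
class ValidationError(DomainError): ...
class PermissionError(DomainError): ...  # note: shadows the builtin name

# Hypothetical edge mapping: one FastAPI exception handler can translate
# these to HTTP responses, keeping HTTPException out of the service layer.
STATUS_BY_ERROR = {
    NotFoundError: 404,
    ConflictError: 409,
    ValidationError: 422,
    PermissionError: 403,
}

def to_status(exc: DomainError) -> int:
    return STATUS_BY_ERROR.get(type(exc), 500)
```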
## Verification
backend-compliance/.venv-phase1: uv python install 3.12 + pip install -r
requirements.txt
PYTHONPATH=. pytest compliance/tests/ tests/contracts/
-> 173 passed, 0 failed, 35 warnings; OpenAPI baseline unchanged
(360 paths / 484 operations)
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Excerpt of the deprecation-sweep diff (datetime.utcnow() ->
datetime.now(timezone.utc) in the DSR router):

```diff
@@ -7,7 +7,7 @@ Native Python/FastAPI Implementierung, ersetzt Go consent-service Proxy.
 import io
 import csv
 import uuid
-from datetime import datetime, timedelta
+from datetime import datetime, timedelta, timezone
 from typing import Optional, List, Dict, Any
 
 from fastapi import APIRouter, Depends, HTTPException, Query, Header
@@ -168,7 +168,7 @@ def _get_tenant(x_tenant_id: Optional[str] = Header(None, alias='X-Tenant-ID'))
 
 def _generate_request_number(db: Session, tenant_id: str) -> str:
     """Generate next request number: DSR-YYYY-NNNNNN"""
-    year = datetime.utcnow().year
+    year = datetime.now(timezone.utc).year
     try:
         result = db.execute(text("SELECT nextval('compliance_dsr_request_number_seq')"))
         seq = result.scalar()
@@ -275,7 +275,7 @@ async def create_dsr(
     if body.priority and body.priority not in VALID_PRIORITIES:
         raise HTTPException(status_code=400, detail=f"Invalid priority. Must be one of: {VALID_PRIORITIES}")
 
-    now = datetime.utcnow()
+    now = datetime.now(timezone.utc)
     deadline_days = DEADLINE_DAYS.get(body.request_type, 30)
     request_number = _generate_request_number(db, tenant_id)
 
@@ -348,7 +348,7 @@ async def list_dsrs(
         query = query.filter(DSRRequestDB.priority == priority)
     if overdue_only:
         query = query.filter(
-            DSRRequestDB.deadline_at < datetime.utcnow(),
+            DSRRequestDB.deadline_at < datetime.now(timezone.utc),
             DSRRequestDB.status.notin_(["completed", "rejected", "cancelled"]),
         )
     if search:
@@ -399,7 +399,7 @@ async def get_dsr_stats(
         by_type[t] = base.filter(DSRRequestDB.request_type == t).count()
 
     # Overdue
-    now = datetime.utcnow()
+    now = datetime.now(timezone.utc)
     overdue = base.filter(
         DSRRequestDB.deadline_at < now,
         DSRRequestDB.status.notin_(["completed", "rejected", "cancelled"]),
@@ -459,7 +459,7 @@ async def export_dsrs(
 
     if format == "json":
         return {
-            "exported_at": datetime.utcnow().isoformat(),
+            "exported_at": datetime.now(timezone.utc).isoformat(),
             "total": len(dsrs),
             "requests": [_dsr_to_dict(d) for d in dsrs],
         }
@@ -506,7 +506,7 @@ async def process_deadlines(
     db: Session = Depends(get_db),
 ):
     """Verarbeitet Fristen und markiert ueberfaellige DSRs."""
-    now = datetime.utcnow()
+    now = datetime.now(timezone.utc)
     tid = uuid.UUID(tenant_id)
 
     overdue = db.query(DSRRequestDB).filter(
@@ -714,7 +714,7 @@ async def publish_template_version(
     if not version:
         raise HTTPException(status_code=404, detail="Version not found")
 
-    now = datetime.utcnow()
+    now = datetime.now(timezone.utc)
     version.status = "published"
     version.published_at = now
     version.published_by = "admin"
@@ -766,7 +766,7 @@ async def update_dsr(
         dsr.internal_notes = body.internal_notes
     if body.assigned_to is not None:
         dsr.assigned_to = body.assigned_to
-        dsr.assigned_at = datetime.utcnow()
+        dsr.assigned_at = datetime.now(timezone.utc)
     if body.request_text is not None:
         dsr.request_text = body.request_text
     if body.affected_systems is not None:
@@ -778,7 +778,7 @@ async def update_dsr(
     if body.objection_details is not None:
         dsr.objection_details = body.objection_details
 
-    dsr.updated_at = datetime.utcnow()
+    dsr.updated_at = datetime.now(timezone.utc)
     db.commit()
     db.refresh(dsr)
     return _dsr_to_dict(dsr)
@@ -797,7 +797,7 @@ async def delete_dsr(
 
     _record_history(db, dsr, "cancelled", comment="DSR storniert")
     dsr.status = "cancelled"
-    dsr.updated_at = datetime.utcnow()
+    dsr.updated_at = datetime.now(timezone.utc)
     db.commit()
     return {"success": True, "message": "DSR cancelled"}
 
@@ -820,7 +820,7 @@ async def change_status(
     dsr = _get_dsr_or_404(db, dsr_id, tenant_id)
     _record_history(db, dsr, body.status, comment=body.comment)
     dsr.status = body.status
-    dsr.updated_at = datetime.utcnow()
+    dsr.updated_at = datetime.now(timezone.utc)
     db.commit()
     db.refresh(dsr)
     return _dsr_to_dict(dsr)
@@ -835,7 +835,7 @@ async def verify_identity(
 ):
     """Verifiziert die Identitaet des Antragstellers."""
     dsr = _get_dsr_or_404(db, dsr_id, tenant_id)
-    now = datetime.utcnow()
+    now = datetime.now(timezone.utc)
 
     dsr.identity_verified = True
     dsr.verification_method = body.method
@@ -868,9 +868,9 @@ async def assign_dsr(
     """Weist eine DSR einem Bearbeiter zu."""
     dsr = _get_dsr_or_404(db, dsr_id, tenant_id)
     dsr.assigned_to = body.assignee_id
-    dsr.assigned_at = datetime.utcnow()
+    dsr.assigned_at = datetime.now(timezone.utc)
     dsr.assigned_by = "admin"
-    dsr.updated_at = datetime.utcnow()
+    dsr.updated_at = datetime.now(timezone.utc)
     db.commit()
     db.refresh(dsr)
     return _dsr_to_dict(dsr)
@@ -888,7 +888,7 @@ async def extend_deadline(
     if dsr.status in ("completed", "rejected", "cancelled"):
         raise HTTPException(status_code=400, detail="Cannot extend deadline for closed DSR")
 
-    now = datetime.utcnow()
+    now = datetime.now(timezone.utc)
     current_deadline = dsr.extended_deadline_at or dsr.deadline_at
     new_deadline = current_deadline + timedelta(days=body.days or 60)
 
@@ -916,7 +916,7 @@ async def complete_dsr(
     if dsr.status in ("completed", "cancelled"):
         raise HTTPException(status_code=400, detail="DSR already completed or cancelled")
 
-    now = datetime.utcnow()
+    now = datetime.now(timezone.utc)
     _record_history(db, dsr, "completed", comment=body.summary)
     dsr.status = "completed"
     dsr.completed_at = now
@@ -941,7 +941,7 @@ async def reject_dsr(
     if dsr.status in ("completed", "rejected", "cancelled"):
         raise HTTPException(status_code=400, detail="DSR already closed")
 
-    now = datetime.utcnow()
+    now = datetime.now(timezone.utc)
     _record_history(db, dsr, "rejected", comment=f"{body.reason} ({body.legal_basis})")
     dsr.status = "rejected"
     dsr.rejection_reason = body.reason
@@ -1024,7 +1024,7 @@ async def send_communication(
 ):
     """Sendet eine Kommunikation."""
     dsr = _get_dsr_or_404(db, dsr_id, tenant_id)
-    now = datetime.utcnow()
+    now = datetime.now(timezone.utc)
 
     comm = DSRCommunicationDB(
         tenant_id=uuid.UUID(tenant_id),
@@ -1158,7 +1158,7 @@ async def update_exception_check(
     check.applies = body.applies
     check.notes = body.notes
     check.checked_by = "admin"
-    check.checked_at = datetime.utcnow()
+    check.checked_at = datetime.now(timezone.utc)
     db.commit()
     db.refresh(check)
 
```