breakpilot-compliance/backend-compliance/compliance/api/crud_factory.py
Sharang Parnerkar 3320ef94fc refactor: phase 0 guardrails + phase 1 step 2 (models.py split)
Squash of branch refactor/phase0-guardrails-and-models-split — 4 commits,
81 files, 173/173 pytest green, OpenAPI contract preserved (360 paths /
484 operations).

## Phase 0 — Architecture guardrails

Three defense-in-depth layers to keep the architecture rules enforced
regardless of who opens Claude Code in this repo:

  1. .claude/settings.json PreToolUse hook on Write/Edit blocks any file
     that would exceed the 500-line hard cap. Auto-loads in every Claude
     session in this repo.
  2. scripts/githooks/pre-commit (install via scripts/install-hooks.sh)
     enforces the LOC cap locally, freezes migrations/ without
     [migration-approved], and protects guardrail files without
     [guardrail-change].
  3. .gitea/workflows/ci.yaml gains loc-budget + guardrail-integrity +
     sbom-scan (syft+grype) jobs, adds mypy --strict for the new Python
     packages (compliance/{services,repositories,domain,schemas}), and
     tsc --noEmit for admin-compliance + developer-portal.

Per-language conventions documented in AGENTS.python.md, AGENTS.go.md,
AGENTS.typescript.md at the repo root — layering, tooling, and explicit
"what you may NOT do" lists. Root CLAUDE.md is prepended with the six
non-negotiable rules. Each of the 10 services gets a README.md.

scripts/check-loc.sh enforces soft 300 / hard 500 and surfaces the
current baseline of 205 hard + 161 soft violations so Phases 1-4 can
drain it incrementally. CI gates only CHANGED files in PRs so the
legacy baseline does not block unrelated work.
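The gate logic can be sketched in a few lines. This is an illustrative Python reimplementation of what scripts/check-loc.sh does, not the script itself; the function names and the changed-files dict are hypothetical.

```python
# Hypothetical sketch of the soft-300 / hard-500 LOC gate; the real
# implementation is the shell script scripts/check-loc.sh.
SOFT_CAP = 300  # warn above this
HARD_CAP = 500  # fail above this

def classify(path: str, line_count: int) -> str:
    """Return 'hard', 'soft', or 'ok' for one file's line count."""
    if line_count > HARD_CAP:
        return "hard"
    if line_count > SOFT_CAP:
        return "soft"
    return "ok"

def gate(changed_files: dict) -> bool:
    """Fail only on hard violations among CHANGED files, so the
    legacy baseline never blocks unrelated work."""
    violations = {p: n for p, n in changed_files.items()
                  if classify(p, n) == "hard"}
    for path, n in sorted(violations.items()):
        print(f"HARD CAP: {path} has {n} lines (> {HARD_CAP})")
    return not violations
```

Scoping the failure condition to changed files is what lets the 205-file hard-violation baseline coexist with a strict gate.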

## Deprecation sweep

47 files. Pydantic V1 regex= -> pattern= (2 sites), class Config ->
ConfigDict in source_policy_router.py (schemas.py intentionally skipped;
it is the Phase 1 Step 3 split target). datetime.utcnow() ->
datetime.now(timezone.utc) everywhere including SQLAlchemy default=
callables. All DB columns already declare timezone=True, so this is a
latent-bug fix on the Python side, not a schema change.

DeprecationWarning count dropped from 158 to 35.
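The shape of the utcnow() replacement, including the SQLAlchemy `default=` case, looks like this (the model column shown in the comments is illustrative):

```python
from datetime import datetime, timezone

def utcnow_aware() -> datetime:
    """Drop-in replacement for the deprecated datetime.utcnow():
    same instant, but timezone-aware instead of naive."""
    return datetime.now(timezone.utc)

# In SQLAlchemy models the callable is passed, not called:
#   created_at = Column(DateTime(timezone=True), default=utcnow_aware)
# versus the old, deprecated, naive form:
#   created_at = Column(DateTime(timezone=True), default=datetime.utcnow)
```

The naive value silently type-checks against a `timezone=True` column, which is why this was a latent bug rather than a visible one.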

## Phase 1 Step 1 — Contract test harness

tests/contracts/test_openapi_baseline.py diffs the live FastAPI /openapi.json
against tests/contracts/openapi.baseline.json on every test run. Fails on
removed paths, removed status codes, or new required request body fields.
Regenerate it only via tests/contracts/regenerate_baseline.py, after an
intentional contract change whose consumers have been updated. This is
the safety harness for all
subsequent refactor commits.
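The core of such a diff can be sketched as follows. This is an illustrative reimplementation, not the code in tests/contracts/; it covers removed paths, operations, and status codes, and omits the new-required-field check for brevity.

```python
# Sketch of a breaking-change diff between two OpenAPI documents.
from typing import Any, Dict, List

def breaking_changes(baseline: Dict[str, Any], live: Dict[str, Any]) -> List[str]:
    problems: List[str] = []
    for path, ops in baseline.get("paths", {}).items():
        live_ops = live.get("paths", {}).get(path)
        if live_ops is None:
            problems.append(f"removed path: {path}")
            continue
        for method, op in ops.items():
            live_op = live_ops.get(method)
            if live_op is None:
                problems.append(f"removed operation: {method.upper()} {path}")
                continue
            for status in op.get("responses", {}):
                if status not in live_op.get("responses", {}):
                    problems.append(
                        f"removed status {status}: {method.upper()} {path}")
    return problems
```

Additions are deliberately ignored: new paths or status codes are backward-compatible, so only removals (and, in the real harness, new required request fields) fail the run.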

## Phase 1 Step 2 — models.py split (1466 -> 85 LOC shim)

compliance/db/models.py is decomposed into seven sibling aggregate modules
following the existing repo pattern (dsr_models.py, vvt_models.py, ...):

  regulation_models.py       (134) — Regulation, Requirement
  control_models.py          (279) — Control, Mapping, Evidence, Risk
  ai_system_models.py        (141) — AISystem, AuditExport
  service_module_models.py   (176) — ServiceModule, ModuleRegulation, ModuleRisk
  audit_session_models.py    (177) — AuditSession, AuditSignOff
  isms_governance_models.py  (323) — ISMSScope, Context, Policy, Objective, SoA
  isms_audit_models.py       (468) — Finding, CAPA, MgmtReview, InternalAudit,
                                     AuditTrail, Readiness

models.py becomes an 85-line re-export shim in dependency order so
existing imports continue to work unchanged. Schema is byte-identical:
__tablename__, column definitions, relationship strings, back_populates,
cascade directives all preserved.

All new sibling files are under the 500-line hard cap; largest is
isms_audit_models.py at 468. No file in compliance/db/ now exceeds
the hard cap.
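The shim pattern can be illustrated with a self-contained toy: the old module name keeps re-exporting classes that now live in sibling modules, so legacy import sites are untouched. The modules below are stand-ins built in memory, not the real compliance.db files.

```python
# Toy illustration of the models.py re-export shim pattern.
import sys
from types import ModuleType

# Stand-in for a split-out aggregate module (e.g. regulation_models.py).
class Regulation:
    pass

regulation_models = ModuleType("regulation_models")
regulation_models.Regulation = Regulation
sys.modules["regulation_models"] = regulation_models

# The shim: the old module name re-exports, in dependency order.
models = ModuleType("models")
models.Regulation = regulation_models.Regulation
models.__all__ = ["Regulation"]
sys.modules["models"] = models

# Legacy import sites keep working unchanged:
from models import Regulation as LegacyRegulation
assert LegacyRegulation is Regulation
```

Because the shim re-exports the same class objects, SQLAlchemy metadata, relationship strings, and isinstance checks all see identical classes regardless of which import path a caller uses.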

## Phase 1 Step 3 — infrastructure only

backend-compliance/compliance/{schemas,domain,repositories}/ packages
are created as landing zones with docstrings. compliance/domain/
exports DomainError / NotFoundError / ConflictError / ValidationError /
PermissionError — the base classes services will use to raise
domain-level errors instead of HTTPException.
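A plausible shape for that hierarchy, plus the intended service-layer usage, is sketched below; the actual definitions in compliance/domain/ may differ, and `get_control` is a hypothetical example, not repo code.

```python
# Sketch of the compliance/domain error hierarchy and how a service
# would raise it instead of fastapi.HTTPException.
class DomainError(Exception):
    """Base class for domain-level failures raised by services."""

class NotFoundError(DomainError): ...
class ConflictError(DomainError): ...
class ValidationError(DomainError): ...
class PermissionError(DomainError): ...  # shadows the builtin inside the domain namespace, as the export list implies

def get_control(controls: dict, control_id: str):
    """Service-layer lookup: raises NotFoundError, never an HTTP error.
    An API-layer exception handler maps DomainError subclasses to
    HTTP status codes in one place."""
    try:
        return controls[control_id]
    except KeyError:
        raise NotFoundError(f"Control {control_id} not found") from None
```

Keeping HTTP concerns out of services is what makes the later repository/service extraction testable without a running FastAPI app.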

PHASE1_RUNBOOK.md at backend-compliance/PHASE1_RUNBOOK.md documents
the nine-step execution plan for Phase 1: snapshot baseline,
characterization tests, split models.py (this commit), split schemas.py
(next), extract services, extract repositories, mypy --strict, coverage.

## Verification

  backend-compliance/.venv-phase1: uv python install 3.12 + pip -r requirements.txt
  PYTHONPATH=. pytest compliance/tests/ tests/contracts/
  -> 173 passed, 0 failed, 35 warnings, OpenAPI 360/484 unchanged

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-07 13:18:29 +02:00


"""
Generic CRUD Router Factory for Compliance API.
Creates standardized CRUD endpoints (list, create, get, update, delete)
for simple resource tables that follow the tenant-isolated pattern:
- Table has `id`, `tenant_id`, `created_at`, `updated_at` columns
- All queries filtered by tenant_id
Usage:
router = create_crud_router(
prefix="/security-backlog",
table_name="compliance_security_backlog",
tag="security-backlog",
columns=["title", "description", "type", "severity", "status", ...],
search_columns=["title", "description"],
filter_columns=["status", "severity", "type"],
order_by="created_at DESC",
resource_name="Security item",
)
"""
import logging
from datetime import datetime, timezone
from typing import Any, Dict, List, Optional
from fastapi import APIRouter, Depends, HTTPException, Query
from sqlalchemy import text
from sqlalchemy.orm import Session
from classroom_engine.database import get_db
from .tenant_utils import get_tenant_id
from .db_utils import row_to_dict
logger = logging.getLogger(__name__)
def create_crud_router(
prefix: str,
table_name: str,
tag: str,
columns: List[str],
search_columns: Optional[List[str]] = None,
filter_columns: Optional[List[str]] = None,
order_by: str = "created_at DESC",
resource_name: str = "Item",
stats_query: Optional[str] = None,
stats_defaults: Optional[Dict[str, int]] = None,
) -> APIRouter:
"""Create a CRUD router with list, create, get/{id}, update/{id}, delete/{id}.
Args:
prefix: URL prefix (e.g. "/security-backlog")
table_name: PostgreSQL table name
tag: OpenAPI tag
columns: Writable column names (excluding id, tenant_id, created_at, updated_at)
search_columns: Columns to ILIKE-search (default: ["title", "description"])
filter_columns: Columns to filter by exact match via query params
order_by: SQL ORDER BY clause
resource_name: Human-readable name for error messages
stats_query: Optional custom SQL for /stats endpoint (must accept :tenant_id param)
stats_defaults: Default dict for stats when no rows found
"""
router = APIRouter(prefix=prefix, tags=[tag])
_search_cols = search_columns or ["title", "description"]
_filter_cols = filter_columns or []
# ── LIST ──────────────────────────────────────────────────────────────
@router.get("")
async def list_items(
search: Optional[str] = Query(None),
limit: int = Query(100, ge=1, le=500),
offset: int = Query(0, ge=0),
db: Session = Depends(get_db),
tenant_id: str = Depends(get_tenant_id),
**kwargs,
):
where = ["tenant_id = :tenant_id"]
params: Dict[str, Any] = {"tenant_id": tenant_id, "limit": limit, "offset": offset}
# Dynamic filter columns from query string
# We can't use **kwargs with FastAPI easily, so we handle this in a wrapper
if search and _search_cols:
clauses = [f"{c} ILIKE :search" for c in _search_cols]
where.append(f"({' OR '.join(clauses)})")
params["search"] = f"%{search}%"
where_sql = " AND ".join(where)
total_row = db.execute(
text(f"SELECT COUNT(*) FROM {table_name} WHERE {where_sql}"),
params,
).fetchone()
total = total_row[0] if total_row else 0
rows = db.execute(
text(f"""
SELECT * FROM {table_name}
WHERE {where_sql}
ORDER BY {order_by}
LIMIT :limit OFFSET :offset
"""),
params,
).fetchall()
return {"items": [row_to_dict(r) for r in rows], "total": total}
# ── STATS (optional) ─────────────────────────────────────────────────
if stats_query:
@router.get("/stats")
async def get_stats(
db: Session = Depends(get_db),
tenant_id: str = Depends(get_tenant_id),
):
row = db.execute(text(stats_query), {"tenant_id": tenant_id}).fetchone()
if row:
d = dict(row._mapping)
return {k: (v or 0) for k, v in d.items()}
return stats_defaults or {}
# ── CREATE ────────────────────────────────────────────────────────────
@router.post("", status_code=201)
async def create_item(
payload: dict = {},
db: Session = Depends(get_db),
tenant_id: str = Depends(get_tenant_id),
):
col_names = ["tenant_id"]
col_params = [":tenant_id"]
values: Dict[str, Any] = {"tenant_id": tenant_id}
for col in columns:
if col in payload:
col_names.append(col)
col_params.append(f":{col}")
values[col] = payload[col]
row = db.execute(
text(f"""
INSERT INTO {table_name} ({', '.join(col_names)})
VALUES ({', '.join(col_params)})
RETURNING *
"""),
values,
).fetchone()
db.commit()
return row_to_dict(row)
# ── GET BY ID ─────────────────────────────────────────────────────────
@router.get("/{item_id}")
async def get_item(
item_id: str,
db: Session = Depends(get_db),
tenant_id: str = Depends(get_tenant_id),
):
row = db.execute(
text(f"SELECT * FROM {table_name} WHERE id = :id AND tenant_id = :tenant_id"),
{"id": item_id, "tenant_id": tenant_id},
).fetchone()
if not row:
raise HTTPException(status_code=404, detail=f"{resource_name} not found")
return row_to_dict(row)
# ── UPDATE ────────────────────────────────────────────────────────────
@router.put("/{item_id}")
async def update_item(
item_id: str,
payload: dict = {},
db: Session = Depends(get_db),
tenant_id: str = Depends(get_tenant_id),
):
updates: Dict[str, Any] = {
"id": item_id,
"tenant_id": tenant_id,
"updated_at": datetime.now(timezone.utc),
}
set_clauses = ["updated_at = :updated_at"]
for field, value in payload.items():
if field in columns:
updates[field] = value
set_clauses.append(f"{field} = :{field}")
if len(set_clauses) == 1:
raise HTTPException(status_code=400, detail="No fields to update")
row = db.execute(
text(f"""
UPDATE {table_name}
SET {', '.join(set_clauses)}
WHERE id = :id AND tenant_id = :tenant_id
RETURNING *
"""),
updates,
).fetchone()
db.commit()
if not row:
raise HTTPException(status_code=404, detail=f"{resource_name} not found")
return row_to_dict(row)
# ── DELETE ────────────────────────────────────────────────────────────
@router.delete("/{item_id}", status_code=204)
async def delete_item(
item_id: str,
db: Session = Depends(get_db),
tenant_id: str = Depends(get_tenant_id),
):
result = db.execute(
text(f"DELETE FROM {table_name} WHERE id = :id AND tenant_id = :tenant_id"),
{"id": item_id, "tenant_id": tenant_id},
)
db.commit()
if result.rowcount == 0:
raise HTTPException(status_code=404, detail=f"{resource_name} not found")
return router