Squash of branch refactor/phase0-guardrails-and-models-split — 4 commits,
81 files, 173/173 pytest green, OpenAPI contract preserved (360 paths /
484 operations).
## Phase 0 — Architecture guardrails
Three defense-in-depth layers to keep the architecture rules enforced
regardless of who opens Claude Code in this repo:
1. A PreToolUse hook on Write/Edit in .claude/settings.json blocks any
   write that would push a file over the 500-line hard cap. It auto-loads
   in every Claude session in this repo.
2. scripts/githooks/pre-commit (install via scripts/install-hooks.sh)
enforces the LOC cap locally, freezes migrations/ without
[migration-approved], and protects guardrail files without
[guardrail-change].
3. .gitea/workflows/ci.yaml gains loc-budget + guardrail-integrity +
sbom-scan (syft+grype) jobs, adds mypy --strict for the new Python
packages (compliance/{services,repositories,domain,schemas}), and
tsc --noEmit for admin-compliance + developer-portal.
Per-language conventions documented in AGENTS.python.md, AGENTS.go.md,
AGENTS.typescript.md at the repo root — layering, tooling, and explicit
"what you may NOT do" lists. Root CLAUDE.md is prepended with the six
non-negotiable rules. Each of the 10 services gets a README.md.
scripts/check-loc.sh enforces soft 300 / hard 500 and surfaces the
current baseline of 205 hard + 161 soft violations so Phases 1-4 can
drain it incrementally. CI gates only CHANGED files in PRs so the
legacy baseline does not block unrelated work.
## Deprecation sweep
47 files touched. Pydantic V1 regex= -> pattern= (2 sites), class Config ->
ConfigDict in source_policy_router.py (schemas.py intentionally skipped;
it is the Phase 1 Step 3 split target). datetime.utcnow() ->
datetime.now(timezone.utc) everywhere including SQLAlchemy default=
callables. All DB columns already declare timezone=True, so this is a
latent-bug fix on the Python side, not a schema change.
DeprecationWarning count dropped from 158 to 35.
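A minimal before/after sketch of the three migrations in one model; the model and field names here are hypothetical, not taken from the repo:

```python
from datetime import datetime, timezone
from pydantic import BaseModel, ConfigDict, Field

class SourcePolicy(BaseModel):
    # V1 `class Config:` -> V2 model_config = ConfigDict(...)
    model_config = ConfigDict(from_attributes=True)
    # V1 Field(regex=...) -> V2 Field(pattern=...)
    slug: str = Field(pattern=r"^[a-z][a-z0-9-]*$")
    # naive datetime.utcnow() -> aware datetime.now(timezone.utc); the same
    # lambda shape works as a SQLAlchemy default= callable
    created_at: datetime = Field(
        default_factory=lambda: datetime.now(timezone.utc)
    )
```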
## Phase 1 Step 1 — Contract test harness
tests/contracts/test_openapi_baseline.py diffs the live FastAPI /openapi.json
against tests/contracts/openapi.baseline.json on every test run. Fails on
removed paths, removed status codes, or new required request body fields.
Regenerate only via tests/contracts/regenerate_baseline.py after a
consumer-updated contract change. This is the safety harness for all
subsequent refactor commits.
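The core of such a baseline diff fits in a few lines. This is a sketch of the idea only, not the harness's actual code:

```python
def removed_paths(baseline: dict, live: dict) -> set[str]:
    """Paths present in the frozen baseline but gone from the live spec."""
    return set(baseline.get("paths", {})) - set(live.get("paths", {}))

def removed_status_codes(baseline: dict, live: dict) -> set[tuple[str, str, str]]:
    """(path, verb, status) triples the live spec no longer documents."""
    gone: set[tuple[str, str, str]] = set()
    for path, ops in baseline.get("paths", {}).items():
        live_ops = live.get("paths", {}).get(path, {})
        for verb, op in ops.items():
            live_responses = live_ops.get(verb, {}).get("responses", {})
            for code in op.get("responses", {}):
                if code not in live_responses:
                    gone.add((path, verb, code))
    return gone
```

A test then simply asserts both sets are empty against the JSON served at /openapi.json; the third check from the text (new required request-body fields) needs a similar walk over requestBody schemas.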
## Phase 1 Step 2 — models.py split (1466 -> 85 LOC shim)
compliance/db/models.py is decomposed into seven sibling aggregate modules
following the existing repo pattern (dsr_models.py, vvt_models.py, ...):
regulation_models.py (134) — Regulation, Requirement
control_models.py (279) — Control, Mapping, Evidence, Risk
ai_system_models.py (141) — AISystem, AuditExport
service_module_models.py (176) — ServiceModule, ModuleRegulation, ModuleRisk
audit_session_models.py (177) — AuditSession, AuditSignOff
isms_governance_models.py (323) — ISMSScope, Context, Policy, Objective, SoA
isms_audit_models.py (468) — Finding, CAPA, MgmtReview, InternalAudit,
AuditTrail, Readiness
models.py becomes an 85-line re-export shim in dependency order so
existing imports continue to work unchanged. Schema is byte-identical:
__tablename__, column definitions, relationship strings, back_populates,
cascade directives all preserved.
All new sibling files are under the 500-line hard cap; largest is
isms_audit_models.py at 468. No file in compliance/db/ now exceeds
the hard cap.
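The shim pattern itself is generic; this self-contained sketch shows the shape with stand-in modules (the real models.py re-exports the SQLAlchemy models from the seven sibling files, importing in dependency order):

```python
import sys
import types

# Stand-in for a sibling aggregate module such as regulation_models.py.
regulation_models = types.ModuleType("regulation_models")

class Regulation:  # placeholder for the real SQLAlchemy model
    pass

regulation_models.Regulation = Regulation
sys.modules["regulation_models"] = regulation_models

# models.py becomes a pure re-export shim: bind the names and every
# existing `from models import Regulation` call site keeps working.
models = types.ModuleType("models")
models.Regulation = sys.modules["regulation_models"].Regulation
models.__all__ = ["Regulation"]
sys.modules["models"] = models

from models import Regulation as R  # legacy import path still resolves
```

Because the shim only re-binds names, nothing about table metadata changes, which is why the schema stays byte-identical.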
## Phase 1 Step 3 — infrastructure only
backend-compliance/compliance/{schemas,domain,repositories}/ packages
are created as landing zones with docstrings. compliance/domain/
exports DomainError / NotFoundError / ConflictError / ValidationError /
PermissionError — the base classes services will use to raise
domain-level errors instead of HTTPException.
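A plausible minimal shape for that hierarchy; the five exported names come from the text, but the docstrings and the HTTP-status mapping in the comments are assumptions:

```python
class DomainError(Exception):
    """Base class services raise instead of fastapi.HTTPException."""

class NotFoundError(DomainError):
    """Requested aggregate does not exist (a router would map this to 404)."""

class ConflictError(DomainError):
    """State conflict such as a duplicate key (would map to 409)."""

class ValidationError(DomainError):
    """Domain-level validation failure (would map to 422)."""

class PermissionError(DomainError):  # intentionally shadows the builtin
    """Caller lacks rights to the aggregate (would map to 403)."""
```

Services stay HTTP-agnostic; a single exception handler at the FastAPI layer can translate each subclass to its status code.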
PHASE1_RUNBOOK.md at backend-compliance/PHASE1_RUNBOOK.md documents
the nine-step execution plan for Phase 1: snapshot baseline,
characterization tests, split models.py (this commit), split schemas.py
(next), extract services, extract repositories, mypy --strict, coverage.
## Verification
backend-compliance/.venv-phase1: uv python install 3.12 + pip install -r requirements.txt
PYTHONPATH=. pytest compliance/tests/ tests/contracts/
-> 173 passed, 0 failed, 35 warnings, OpenAPI 360/484 unchanged
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Attached file: 246 lines, 8.2 KiB, Python
"""
|
|
FastAPI routes for Security Backlog Tracking.
|
|
|
|
Endpoints:
|
|
GET /security-backlog — list with filters (status, severity, type, search; limit/offset)
|
|
GET /security-backlog/stats — open, critical, high, overdue counts
|
|
POST /security-backlog — create finding
|
|
PUT /security-backlog/{id} — update finding
|
|
DELETE /security-backlog/{id} — delete finding (204)
|
|
"""
|
|
|
|
import logging
|
|
from datetime import datetime, timezone
|
|
from typing import Optional, Any, Dict
|
|
|
|
from fastapi import APIRouter, Depends, HTTPException, Query
|
|
from pydantic import BaseModel
|
|
from sqlalchemy import text
|
|
from sqlalchemy.orm import Session
|
|
|
|
from classroom_engine.database import get_db
|
|
from .tenant_utils import get_tenant_id as _get_tenant_id
|
|
from .db_utils import row_to_dict as _row_to_dict
|
|
|
|
logger = logging.getLogger(__name__)
|
|
router = APIRouter(prefix="/security-backlog", tags=["security-backlog"])
|
|
|
|
|
|
# =============================================================================
|
|
# Pydantic Schemas
|
|
# =============================================================================
|
|
|
|
class SecurityItemCreate(BaseModel):
|
|
title: str
|
|
description: Optional[str] = None
|
|
type: str = "vulnerability"
|
|
severity: str = "medium"
|
|
status: str = "open"
|
|
source: Optional[str] = None
|
|
cve: Optional[str] = None
|
|
cvss: Optional[float] = None
|
|
affected_asset: Optional[str] = None
|
|
assigned_to: Optional[str] = None
|
|
due_date: Optional[datetime] = None
|
|
remediation: Optional[str] = None
|
|
|
|
|
|
class SecurityItemUpdate(BaseModel):
|
|
title: Optional[str] = None
|
|
description: Optional[str] = None
|
|
type: Optional[str] = None
|
|
severity: Optional[str] = None
|
|
status: Optional[str] = None
|
|
source: Optional[str] = None
|
|
cve: Optional[str] = None
|
|
cvss: Optional[float] = None
|
|
affected_asset: Optional[str] = None
|
|
assigned_to: Optional[str] = None
|
|
due_date: Optional[datetime] = None
|
|
remediation: Optional[str] = None
|
|
|
|
|
|
|
|
# =============================================================================
|
|
# Routes
|
|
# =============================================================================
|
|
|
|
@router.get("")
|
|
async def list_security_items(
|
|
status: Optional[str] = Query(None),
|
|
severity: Optional[str] = Query(None),
|
|
type: Optional[str] = Query(None),
|
|
search: Optional[str] = Query(None),
|
|
limit: int = Query(100, ge=1, le=500),
|
|
offset: int = Query(0, ge=0),
|
|
db: Session = Depends(get_db),
|
|
tenant_id: str = Depends(_get_tenant_id),
|
|
):
|
|
"""List security backlog items with optional filters."""
|
|
|
|
where_clauses = ["tenant_id = :tenant_id"]
|
|
params: Dict[str, Any] = {"tenant_id": tenant_id, "limit": limit, "offset": offset}
|
|
|
|
if status:
|
|
where_clauses.append("status = :status")
|
|
params["status"] = status
|
|
if severity:
|
|
where_clauses.append("severity = :severity")
|
|
params["severity"] = severity
|
|
if type:
|
|
where_clauses.append("type = :type")
|
|
params["type"] = type
|
|
if search:
|
|
where_clauses.append("(title ILIKE :search OR description ILIKE :search)")
|
|
params["search"] = f"%{search}%"
|
|
|
|
where_sql = " AND ".join(where_clauses)
|
|
|
|
total_row = db.execute(
|
|
text(f"SELECT COUNT(*) FROM compliance_security_backlog WHERE {where_sql}"),
|
|
params,
|
|
).fetchone()
|
|
total = total_row[0] if total_row else 0
|
|
|
|
rows = db.execute(
|
|
text(f"""
|
|
SELECT * FROM compliance_security_backlog
|
|
WHERE {where_sql}
|
|
ORDER BY
|
|
CASE severity
|
|
WHEN 'critical' THEN 0
|
|
WHEN 'high' THEN 1
|
|
WHEN 'medium' THEN 2
|
|
ELSE 3
|
|
END,
|
|
CASE status
|
|
WHEN 'open' THEN 0
|
|
WHEN 'in-progress' THEN 1
|
|
WHEN 'accepted-risk' THEN 2
|
|
ELSE 3
|
|
END,
|
|
created_at DESC
|
|
LIMIT :limit OFFSET :offset
|
|
"""),
|
|
params,
|
|
).fetchall()
|
|
|
|
return {
|
|
"items": [_row_to_dict(r) for r in rows],
|
|
"total": total,
|
|
}
|
|
|
|
|
|
@router.get("/stats")
|
|
async def get_security_stats(
|
|
db: Session = Depends(get_db),
|
|
tenant_id: str = Depends(_get_tenant_id),
|
|
):
|
|
"""Return security backlog counts."""
|
|
|
|
rows = db.execute(text("""
|
|
SELECT
|
|
COUNT(*) FILTER (WHERE status = 'open') AS open,
|
|
COUNT(*) FILTER (WHERE status = 'in-progress') AS in_progress,
|
|
COUNT(*) FILTER (WHERE status = 'resolved') AS resolved,
|
|
COUNT(*) FILTER (WHERE status = 'accepted-risk') AS accepted_risk,
|
|
COUNT(*) FILTER (WHERE severity = 'critical' AND status != 'resolved') AS critical,
|
|
COUNT(*) FILTER (WHERE severity = 'high' AND status != 'resolved') AS high,
|
|
COUNT(*) FILTER (
|
|
WHERE due_date IS NOT NULL
|
|
AND due_date < NOW()
|
|
AND status NOT IN ('resolved', 'accepted-risk')
|
|
) AS overdue,
|
|
COUNT(*) AS total
|
|
FROM compliance_security_backlog
|
|
WHERE tenant_id = :tenant_id
|
|
"""), {"tenant_id": tenant_id}).fetchone()
|
|
|
|
if rows:
|
|
d = dict(rows._mapping)
|
|
return {k: (v or 0) for k, v in d.items()}
|
|
return {"open": 0, "in_progress": 0, "resolved": 0, "accepted_risk": 0,
|
|
"critical": 0, "high": 0, "overdue": 0, "total": 0}
|
|
|
|
|
|
@router.post("", status_code=201)
|
|
async def create_security_item(
|
|
payload: SecurityItemCreate,
|
|
db: Session = Depends(get_db),
|
|
tenant_id: str = Depends(_get_tenant_id),
|
|
):
|
|
"""Create a new security backlog item."""
|
|
|
|
row = db.execute(text("""
|
|
INSERT INTO compliance_security_backlog
|
|
(tenant_id, title, description, type, severity, status,
|
|
source, cve, cvss, affected_asset, assigned_to, due_date, remediation)
|
|
VALUES
|
|
(:tenant_id, :title, :description, :type, :severity, :status,
|
|
:source, :cve, :cvss, :affected_asset, :assigned_to, :due_date, :remediation)
|
|
RETURNING *
|
|
"""), {
|
|
"tenant_id": tenant_id,
|
|
"title": payload.title,
|
|
"description": payload.description,
|
|
"type": payload.type,
|
|
"severity": payload.severity,
|
|
"status": payload.status,
|
|
"source": payload.source,
|
|
"cve": payload.cve,
|
|
"cvss": payload.cvss,
|
|
"affected_asset": payload.affected_asset,
|
|
"assigned_to": payload.assigned_to,
|
|
"due_date": payload.due_date,
|
|
"remediation": payload.remediation,
|
|
}).fetchone()
|
|
db.commit()
|
|
return _row_to_dict(row)
|
|
|
|
|
|
@router.put("/{item_id}")
|
|
async def update_security_item(
|
|
item_id: str,
|
|
payload: SecurityItemUpdate,
|
|
db: Session = Depends(get_db),
|
|
tenant_id: str = Depends(_get_tenant_id),
|
|
):
|
|
"""Update a security backlog item."""
|
|
|
|
updates: Dict[str, Any] = {"id": item_id, "tenant_id": tenant_id, "updated_at": datetime.now(timezone.utc)}
|
|
set_clauses = ["updated_at = :updated_at"]
|
|
|
|
for field, value in payload.model_dump(exclude_unset=True).items():
|
|
updates[field] = value
|
|
set_clauses.append(f"{field} = :{field}")
|
|
|
|
if len(set_clauses) == 1:
|
|
raise HTTPException(status_code=400, detail="No fields to update")
|
|
|
|
row = db.execute(text(f"""
|
|
UPDATE compliance_security_backlog
|
|
SET {', '.join(set_clauses)}
|
|
WHERE id = :id AND tenant_id = :tenant_id
|
|
RETURNING *
|
|
"""), updates).fetchone()
|
|
db.commit()
|
|
|
|
if not row:
|
|
raise HTTPException(status_code=404, detail="Security item not found")
|
|
return _row_to_dict(row)
|
|
|
|
|
|
@router.delete("/{item_id}", status_code=204)
|
|
async def delete_security_item(
|
|
item_id: str,
|
|
db: Session = Depends(get_db),
|
|
tenant_id: str = Depends(_get_tenant_id),
|
|
):
|
|
result = db.execute(text("""
|
|
DELETE FROM compliance_security_backlog
|
|
WHERE id = :id AND tenant_id = :tenant_id
|
|
"""), {"id": item_id, "tenant_id": tenant_id})
|
|
db.commit()
|
|
if result.rowcount == 0:
|
|
raise HTTPException(status_code=404, detail="Security item not found")
|