refactor: phase 0 guardrails + phase 1 step 2 (models.py split)

Squash of branch refactor/phase0-guardrails-and-models-split — 4 commits,
81 files, 173/173 pytest green, OpenAPI contract preserved (360 paths /
484 operations).

## Phase 0 — Architecture guardrails

Three defense-in-depth layers to keep the architecture rules enforced
regardless of who opens Claude Code in this repo:

  1. A .claude/settings.json PreToolUse hook on Write/Edit blocks any
     edit that would push a file past the 500-line hard cap. It loads
     automatically in every Claude session in this repo.
  2. scripts/githooks/pre-commit (install via scripts/install-hooks.sh)
     enforces the LOC cap locally, freezes migrations/ without
     [migration-approved], and protects guardrail files without
     [guardrail-change].
  3. .gitea/workflows/ci.yaml gains loc-budget + guardrail-integrity +
     sbom-scan (syft+grype) jobs, adds mypy --strict for the new Python
     packages (compliance/{services,repositories,domain,schemas}), and
     tsc --noEmit for admin-compliance + developer-portal.

Per-language conventions documented in AGENTS.python.md, AGENTS.go.md,
AGENTS.typescript.md at the repo root — layering, tooling, and explicit
"what you may NOT do" lists. Root CLAUDE.md is prepended with the six
non-negotiable rules. Each of the 10 services gets a README.md.

scripts/check-loc.sh enforces soft 300 / hard 500 and surfaces the
current baseline of 205 hard + 161 soft violations so Phases 1-4 can
drain it incrementally. CI gates only CHANGED files in PRs so the
legacy baseline does not block unrelated work.
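The soft/hard budget logic reduces to something like this (a minimal sketch; the real scripts/check-loc.sh additionally knows the legacy baseline and the changed-files-only mode used in CI):

```bash
# check_loc FILE...  — print violations; return 1 if any file breaks the hard cap
check_loc() {
  soft=300 hard=500 rc=0
  for f in "$@"; do
    loc=$(wc -l < "$f")
    if [ "$loc" -gt "$hard" ]; then
      printf 'HARD %s %s\n' "$loc" "$f"; rc=1
    elif [ "$loc" -gt "$soft" ]; then
      printf 'soft %s %s\n' "$loc" "$f"
    fi
  done
  return "$rc"
}
```

Soft violations only print; hard violations fail the run, which is what lets the gate stay advisory at 300 LOC while remaining a hard stop at 500.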

## Deprecation sweep

47 files. Pydantic V1 regex= -> pattern= (2 sites), class Config ->
ConfigDict in source_policy_router.py (schemas.py intentionally skipped;
it is the Phase 1 Step 3 split target). datetime.utcnow() ->
datetime.now(timezone.utc) everywhere including SQLAlchemy default=
callables. All DB columns already declare timezone=True, so this is a
latent-bug fix on the Python side, not a schema change.

DeprecationWarning count dropped from 158 to 35.
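The swap is behavior-relevant, not cosmetic: utcnow() returns a naive datetime while now(timezone.utc) returns an aware one, and the two cannot even be compared:

```python
from datetime import datetime, timezone

# datetime.utcnow() is deprecated since Python 3.12 and returns a *naive*
# datetime: correct UTC wall time, but with no tzinfo attached.
naive = datetime.utcnow()
aware = datetime.now(timezone.utc)

assert naive.tzinfo is None
assert aware.tzinfo is timezone.utc

# Mixing naive and aware values raises at comparison time, which is
# exactly the class of latent bug the sweep removes.
try:
    naive < aware
except TypeError as exc:
    print(f"comparison fails: {exc}")
```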

## Phase 1 Step 1 — Contract test harness

tests/contracts/test_openapi_baseline.py diffs the live FastAPI /openapi.json
against tests/contracts/openapi.baseline.json on every test run. Fails on
removed paths, removed status codes, or new required request body fields.
Regenerate only via tests/contracts/regenerate_baseline.py after a
consumer-updated contract change. This is the safety harness for all
subsequent refactor commits.
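A minimal sketch of the removal checks (illustrative function name; the real test additionally flags new required request-body fields):

```python
"""Sketch of the contract diff in tests/contracts/test_openapi_baseline.py."""

def breaking_changes(baseline: dict, live: dict) -> list[str]:
    """Compare two OpenAPI documents; report removed paths,
    removed operations, and removed response status codes."""
    problems: list[str] = []
    for path, ops in baseline.get("paths", {}).items():
        live_ops = live.get("paths", {}).get(path)
        if live_ops is None:
            problems.append(f"removed path: {path}")
            continue
        for method, op in ops.items():
            live_op = live_ops.get(method)
            if live_op is None:
                problems.append(f"removed operation: {method.upper()} {path}")
                continue
            for code in op.get("responses", {}):
                if code not in live_op.get("responses", {}):
                    problems.append(
                        f"removed status {code}: {method.upper()} {path}"
                    )
    return problems
```

Anything this returns fails the suite; additions are allowed, removals are not.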

## Phase 1 Step 2 — models.py split (1466 -> 85 LOC shim)

compliance/db/models.py is decomposed into seven sibling aggregate modules
following the existing repo pattern (dsr_models.py, vvt_models.py, ...):

  regulation_models.py       (134) — Regulation, Requirement
  control_models.py          (279) — Control, Mapping, Evidence, Risk
  ai_system_models.py        (141) — AISystem, AuditExport
  service_module_models.py   (176) — ServiceModule, ModuleRegulation, ModuleRisk
  audit_session_models.py    (177) — AuditSession, AuditSignOff
  isms_governance_models.py  (323) — ISMSScope, Context, Policy, Objective, SoA
  isms_audit_models.py       (468) — Finding, CAPA, MgmtReview, InternalAudit,
                                     AuditTrail, Readiness

models.py becomes an 85-line re-export shim in dependency order so
existing imports continue to work unchanged. Schema is byte-identical:
__tablename__, column definitions, relationship strings, back_populates,
cascade directives all preserved.

All new sibling files sit under the 500-line hard cap (largest:
isms_audit_models.py at 468); no file in compliance/db/ now exceeds it.

## Phase 1 Step 3 — infrastructure only

backend-compliance/compliance/{schemas,domain,repositories}/ packages
are created as landing zones with docstrings. compliance/domain/
exports DomainError / NotFoundError / ConflictError / ValidationError /
PermissionError — the base classes services will use to raise
domain-level errors instead of HTTPException.

PHASE1_RUNBOOK.md at backend-compliance/PHASE1_RUNBOOK.md documents
the nine-step execution plan for Phase 1: snapshot baseline,
characterization tests, split models.py (this commit), split schemas.py
(next), extract services, extract repositories, mypy --strict, coverage,
guardrail enforcement.

## Verification

  backend-compliance/.venv-phase1: uv python install 3.12 + pip install -r requirements.txt
  PYTHONPATH=. pytest compliance/tests/ tests/contracts/
  -> 173 passed, 0 failed, 35 warnings, OpenAPI 360/484 unchanged

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Commit 3320ef94fc (parent 1dfea51919)
Sharang Parnerkar, 2026-04-07 13:18:29 +02:00
84 changed files with 52849 additions and 1731 deletions


@@ -0,0 +1,181 @@
# Phase 1 Runbook — backend-compliance refactor
This document is the step-by-step execution guide for Phase 1 of the repo refactor plan at `~/.claude/plans/vectorized-purring-barto.md`. It exists because the refactor must be driven from a session that can actually run `pytest` against the service, and every step must be verified green before moving to the next.
## Prerequisites
- Python 3.12 venv with `backend-compliance/requirements.txt` installed.
- Local Postgres reachable via `COMPLIANCE_DATABASE_URL` (use the compose db).
- Existing 48 pytest test files pass from a clean checkout: `pytest compliance/tests/ -v` → all green. **Do not proceed until this is true.**
## Step 0 — Record the baseline
```bash
cd backend-compliance
pytest compliance/tests/ -v --tb=short | tee /tmp/baseline.txt
pytest --cov=compliance --cov-report=term | tee /tmp/baseline-coverage.txt
python tests/contracts/regenerate_baseline.py # creates openapi.baseline.json
git add tests/contracts/openapi.baseline.json
git commit -m "phase1: pin OpenAPI baseline before refactor"
```
The baseline file is the contract. From this point forward, `pytest tests/contracts/` MUST stay green.
## Step 1 — Characterization tests (before any code move)
For each oversized route file we will refactor, add a happy-path + 1-error-path test **before** touching the source. These are called "characterization tests" and their purpose is to freeze current observable behavior so the refactor cannot change it silently.
Oversized route files to cover (ordered by size):
| File | LOC | Endpoints to cover |
|---|---:|---|
| `compliance/api/isms_routes.py` | 1676 | one happy + one 4xx per route |
| `compliance/api/dsr_routes.py` | 1176 | same |
| `compliance/api/vvt_routes.py` | *N* | same |
| `compliance/api/dsfa_routes.py` | *N* | same |
| `compliance/api/tom_routes.py` | *N* | same |
| `compliance/api/schemas.py` | 1899 | N/A (covered transitively) |
| `compliance/db/models.py` | 1466 | N/A (covered by existing + route tests) |
| `compliance/db/repository.py` | 1547 | add unit tests per repo class as they are extracted |
Use `httpx.AsyncClient` + factory fixtures; see `AGENTS.python.md`. Place under `tests/integration/test_<domain>_contract.py`.
Commit: `phase1: characterization tests for <domain> routes`.
## Step 2 — Split `compliance/db/models.py` (1466 → <500 per file)
⚠️ **Atomic step.** A `compliance/db/models/` package CANNOT coexist with the existing `compliance/db/models.py` module — Python's import system shadows the module with the package, breaking every `from compliance.db.models import X` call. The directory skeleton was intentionally NOT pre-created for this reason. Do the following in **one commit**:
1. Create `compliance/db/models/` directory with `__init__.py` (re-export shim — see template below).
2. Move aggregate model classes into `compliance/db/models/<aggregate>.py` modules.
3. Delete the old `compliance/db/models.py` file in the same commit.
Strategy uses a **re-export shim** so no import sites change:
1. For each aggregate, create `compliance/db/models/<aggregate>.py` containing the model classes. Copy verbatim; do not rename `__tablename__`, columns, or relationship strings.
2. Aggregate suggestions (verify by reading `models.py`):
- `dsr.py` (DSR requests, exports)
- `dsfa.py`
- `vvt.py`
- `tom.py`
- `ai.py` (AI systems, compliance checks)
- `consent.py`
- `evidence.py`
- `vendor.py`
- `audit.py`
- `policy.py`
- `project.py`
3. After every aggregate is moved, fill `compliance/db/models/__init__.py` with the re-export shim (the old `compliance/db/models.py` was deleted in the same commit and must not be recreated beside the package, or the import shadowing above returns):
```python
"""compliance.db.models package — re-export shim over the aggregate modules."""
from compliance.db.models.dsr import * # noqa: F401,F403
from compliance.db.models.dsfa import * # noqa: F401,F403
# ... one per module
```
This keeps `from compliance.db.models import XYZ` working everywhere it's used today.
4. Run `pytest` after every move. Green → commit. Red → revert that move and investigate.
5. Existing aggregate-level files (`compliance/db/dsr_models.py`, `vvt_models.py`, `tom_models.py`, etc.) should be folded into the new `compliance/db/models/` package in the same pass — do not leave two parallel naming conventions.
**Do not** add `__init__.py` star-imports that change `Base.metadata` discovery order. Alembic's autogenerate depends on it. Verify via: `alembic check` if the env is set up.
## Step 3 — Split `compliance/api/schemas.py` (1899 → per domain)
Mirror the models split:
1. For each domain, create `compliance/schemas/<domain>.py` with the Pydantic models.
2. Replace `compliance/api/schemas.py` with a re-export shim.
3. Keep `Create`/`Update`/`Read` variants separated; do not merge them into unions.
4. Run `pytest` + contract test after each domain. Green → commit.
## Step 4 — Extract services (router → service delegation)
For each route file > 500 LOC, pull handler bodies into a service class under `compliance/services/<domain>_service.py` (new-style domain services, not the utility `compliance/services/` modules that already exist — consider renaming those to `compliance/services/_legacy/` if collisions arise).
Router handlers become:
```python
@router.post("/dsr/requests", response_model=DSRRequestRead, status_code=201)
async def create_dsr_request(
payload: DSRRequestCreate,
service: DSRService = Depends(get_dsr_service),
tenant_id: UUID = Depends(get_tenant_id),
) -> DSRRequestRead:
try:
return await service.create(tenant_id, payload)
except ConflictError as exc:
raise HTTPException(409, str(exc)) from exc
except NotFoundError as exc:
raise HTTPException(404, str(exc)) from exc
```
Rules:
- Handler body ≤ 30 LOC.
- Service raises domain errors (`compliance.domain`), never `HTTPException`.
- Inject service via `Depends` on a factory that wires the repository.
Run tests after each router is thinned. Contract test must stay green.
## Step 5 — Extract repositories
`compliance/db/repository.py` (1547) and `compliance/db/isms_repository.py` (838) split into:
```
compliance/repositories/
├── dsr_repository.py
├── dsfa_repository.py
├── vvt_repository.py
├── isms_repository.py # <500 LOC, split if needed
└── ...
```
Each repository class:
- Takes `AsyncSession` (or equivalent) in constructor.
- Exposes intent-named methods (`get_pending_for_tenant`, not `select_where`).
- Returns ORM instances or domain VOs. No `Row`.
- No business logic.
Unit-test every repo class against the compose Postgres with a transactional fixture (begin → rollback).
## Step 6 — mypy --strict on new packages
CI already runs `mypy --strict` against `compliance/{services,repositories,domain,schemas}/`. After every extraction, verify locally:
```bash
mypy --strict --ignore-missing-imports compliance/schemas compliance/repositories compliance/domain compliance/services
```
If you have type errors, fix them in the extracted module. **Do not** add `# type: ignore` blanket waivers. If a third-party lib is poorly typed, add it to `[mypy.overrides]` in `pyproject.toml`/`mypy.ini` with a one-line rationale.
## Step 7 — Expand test coverage
- Unit tests per service (mocked repo).
- Integration tests per repository (real db, transactional).
- Contract test stays green.
- Target: 80% coverage on new code. Never decrease the service baseline.
## Step 8 — Guardrail enforcement
After Phase 1 completes, `compliance/db/models.py`, `compliance/db/repository.py`, and `compliance/api/schemas.py` are either re-export shims (≤50 LOC each) or deleted. No file in `backend-compliance/compliance/` exceeds 500 LOC. Run:
```bash
../scripts/check-loc.sh backend-compliance/
```
Any remaining hard violations → document in `.claude/rules/loc-exceptions.txt` with rationale, or keep splitting.
## Done when
- `pytest compliance/tests/ tests/ -v` all green.
- `pytest tests/contracts/` green — OpenAPI has no removals, no renames, no new required request fields.
- Coverage ≥ baseline.
- `mypy --strict` clean on new packages.
- `scripts/check-loc.sh backend-compliance/` reports 0 hard violations in new/touched files (legacy allowlisted in `loc-exceptions.txt` only with rationale).
- CI all green on PR.
## Pitfalls
- **Do not change `__tablename__` or column names.** Even a rename breaks the DB contract.
- **Do not change relationship back_populates / backref strings.** SQLAlchemy resolves these by name at mapper configuration.
- **Do not change route paths or pydantic field names.** Contract test will catch most — but JSON field aliasing (`Field(alias=...)`) is easy to break accidentally.
- **Do not eagerly reformat unrelated code.** Keep the diff reviewable. One PR per major step.
- **Do not bypass the pre-commit hook.** If a file legitimately must be >500 LOC during an intermediate step, squash commits at the end so the final state is clean.
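The aliasing pitfall in miniature (illustrative model, Pydantic v2 API): the wire name lives in the alias, so dropping or renaming the alias silently renames the JSON field even though every Python attribute still resolves.

```python
from pydantic import BaseModel, ConfigDict, Field

class DSRRead(BaseModel):
    model_config = ConfigDict(populate_by_name=True)
    # "requestNumber" is the public wire name; "request_number" is internal.
    request_number: str = Field(alias="requestNumber")

item = DSRRead(requestNumber="DSR-2026-000001")

# Internal dump uses attribute names; only by_alias=True emits the contract name.
assert item.model_dump() == {"request_number": "DSR-2026-000001"}
assert item.model_dump(by_alias=True) == {"requestNumber": "DSR-2026-000001"}
```

Whether a response serializes by alias depends on the response-model configuration, not on the attribute name, which is why an alias change can slip past a casual review.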


@@ -0,0 +1,55 @@
# backend-compliance
Python/FastAPI service implementing the DSGVO compliance API: DSR, DSFA, consent, controls, risks, evidence, audit, vendor management, ISMS, change requests, document generation.
**Port:** `8002` (container: `bp-compliance-backend`)
**Stack:** Python 3.12, FastAPI, SQLAlchemy 2.x, Alembic, Keycloak auth.
## Architecture (target — Phase 1)
```
compliance/
├── api/ # Routers (thin, ≤30 LOC per handler)
├── services/ # Business logic
├── repositories/ # DB access
├── domain/ # Value objects, domain errors
├── schemas/ # Pydantic models, split per domain
└── db/models/ # SQLAlchemy ORM, one module per aggregate
```
See `../AGENTS.python.md` for the full convention and `../.claude/rules/architecture.md` for the non-negotiable rules.
## Run locally
```bash
cd backend-compliance
pip install -r requirements.txt
export COMPLIANCE_DATABASE_URL=... # Postgres (Hetzner or local)
uvicorn main:app --reload --port 8002
```
## Tests
```bash
pytest compliance/tests/ -v
pytest --cov=compliance --cov-report=term-missing
```
Layout: `tests/unit/`, `tests/integration/`, `tests/contracts/`. Contract tests diff `/openapi.json` against `tests/contracts/openapi.baseline.json`.
## Public API surface
404+ endpoints across `/api/v1/*`. Grouped by domain: `ai`, `audit`, `consent`, `dsfa`, `dsr`, `gdpr`, `vendor`, `evidence`, `change-requests`, `generation`, `projects`, `company-profile`, `isms`. Every path is a contract — see the "Public endpoints" rule in the root `CLAUDE.md`.
## Environment
| Var | Purpose |
|-----|---------|
| `COMPLIANCE_DATABASE_URL` | Postgres DSN, `sslmode=require` |
| `KEYCLOAK_*` | Auth verification |
| `QDRANT_URL`, `QDRANT_API_KEY` | Vector search |
| `CORE_VALKEY_URL` | Session cache |
## Don't touch
Database schema, `__tablename__`, column names, existing migrations under `migrations/`. See root `CLAUDE.md` rule 3.


@@ -186,7 +186,7 @@ async def update_ai_system(
if hasattr(system, key):
setattr(system, key, value)
-system.updated_at = datetime.utcnow()
+system.updated_at = datetime.now(timezone.utc)
db.commit()
db.refresh(system)
@@ -266,7 +266,7 @@ async def assess_ai_system(
except ValueError:
system.classification = AIClassificationEnum.UNCLASSIFIED
-system.assessment_date = datetime.utcnow()
+system.assessment_date = datetime.now(timezone.utc)
system.assessment_result = assessment_result
system.obligations = _derive_obligations(classification)
system.risk_factors = assessment_result.get("risk_factors", [])


@@ -9,7 +9,7 @@ Endpoints:
"""
import logging
-from datetime import datetime
+from datetime import datetime, timezone
from typing import Optional, List
from uuid import uuid4
import hashlib
@@ -204,7 +204,7 @@ async def start_audit_session(
)
session.status = AuditSessionStatusEnum.IN_PROGRESS
-session.started_at = datetime.utcnow()
+session.started_at = datetime.now(timezone.utc)
db.commit()
return {"success": True, "message": "Audit session started", "status": "in_progress"}
@@ -229,7 +229,7 @@ async def complete_audit_session(
)
session.status = AuditSessionStatusEnum.COMPLETED
-session.completed_at = datetime.utcnow()
+session.completed_at = datetime.now(timezone.utc)
db.commit()
return {"success": True, "message": "Audit session completed", "status": "completed"}
@@ -482,7 +482,7 @@ async def sign_off_item(
# Update existing sign-off
signoff.result = result_enum
signoff.notes = request.notes
-signoff.updated_at = datetime.utcnow()
+signoff.updated_at = datetime.now(timezone.utc)
else:
# Create new sign-off
signoff = AuditSignOffDB(
@@ -497,11 +497,11 @@ async def sign_off_item(
# Create digital signature if requested
signature = None
if request.sign:
-timestamp = datetime.utcnow().isoformat()
+timestamp = datetime.now(timezone.utc).isoformat()
data = f"{result_enum.value}|{requirement_id}|{session.auditor_name}|{timestamp}"
signature = hashlib.sha256(data.encode()).hexdigest()
signoff.signature_hash = signature
-signoff.signed_at = datetime.utcnow()
+signoff.signed_at = datetime.now(timezone.utc)
signoff.signed_by = session.auditor_name
# Update session statistics
@@ -523,7 +523,7 @@ async def sign_off_item(
# Auto-start session if this is the first sign-off
if session.status == AuditSessionStatusEnum.DRAFT:
session.status = AuditSessionStatusEnum.IN_PROGRESS
-session.started_at = datetime.utcnow()
+session.started_at = datetime.now(timezone.utc)
db.commit()
db.refresh(signoff)
@@ -587,7 +587,7 @@ async def get_sign_off(
@router.get("/sessions/{session_id}/report/pdf")
async def generate_audit_pdf_report(
session_id: str,
-language: str = Query("de", regex="^(de|en)$"),
+language: str = Query("de", pattern="^(de|en)$"),
include_signatures: bool = Query(True),
db: Session = Depends(get_db),
):


@@ -6,7 +6,7 @@ Public SDK-Endpoints (fuer Einbettung) + Admin-Endpoints (Konfiguration & Stats)
import uuid
import hashlib
-from datetime import datetime, timedelta
+from datetime import datetime, timedelta, timezone
from typing import Optional, List
from fastapi import APIRouter, Depends, HTTPException, Query, Header
@@ -206,8 +206,8 @@ async def record_consent(
existing.ip_hash = ip_hash
existing.user_agent = body.user_agent
existing.consent_string = body.consent_string
-existing.expires_at = datetime.utcnow() + timedelta(days=365)
-existing.updated_at = datetime.utcnow()
+existing.expires_at = datetime.now(timezone.utc) + timedelta(days=365)
+existing.updated_at = datetime.now(timezone.utc)
db.flush()
_log_banner_audit(
@@ -227,7 +227,7 @@ async def record_consent(
ip_hash=ip_hash,
user_agent=body.user_agent,
consent_string=body.consent_string,
-expires_at=datetime.utcnow() + timedelta(days=365),
+expires_at=datetime.now(timezone.utc) + timedelta(days=365),
)
db.add(consent)
db.flush()
@@ -476,7 +476,7 @@ async def update_site_config(
if val is not None:
setattr(config, field, val)
-config.updated_at = datetime.utcnow()
+config.updated_at = datetime.now(timezone.utc)
db.commit()
db.refresh(config)
return _site_config_to_dict(config)


@@ -11,7 +11,7 @@ Endpoints:
"""
import logging
-from datetime import datetime
+from datetime import datetime, timezone
from typing import Optional
from fastapi import APIRouter, Depends, HTTPException, Header
@@ -173,7 +173,7 @@ async def update_consent_template(
set_clauses = ", ".join(f"{k} = :{k}" for k in updates)
updates["id"] = template_id
updates["tenant_id"] = tenant_id
-updates["now"] = datetime.utcnow()
+updates["now"] = datetime.now(timezone.utc)
row = db.execute(
text(f"""


@@ -186,7 +186,7 @@ async def list_jobs(
@router.get("/generate/review-queue")
async def get_review_queue(
-release_state: str = Query("needs_review", regex="^(needs_review|too_close|duplicate)$"),
+release_state: str = Query("needs_review", pattern="^(needs_review|too_close|duplicate)$"),
limit: int = Query(50, ge=1, le=200),
):
"""Get controls that need manual review."""


@@ -20,7 +20,7 @@ Usage:
"""
import logging
-from datetime import datetime
+from datetime import datetime, timezone
from typing import Any, Dict, List, Optional
from fastapi import APIRouter, Depends, HTTPException, Query
@@ -171,7 +171,7 @@ def create_crud_router(
updates: Dict[str, Any] = {
"id": item_id,
"tenant_id": tenant_id,
-"updated_at": datetime.utcnow(),
+"updated_at": datetime.now(timezone.utc),
}
set_clauses = ["updated_at = :updated_at"]


@@ -10,7 +10,7 @@ Endpoints:
"""
import logging
-from datetime import datetime, timedelta
+from datetime import datetime, timedelta, timezone
from calendar import month_abbr
from typing import Optional
@@ -167,7 +167,7 @@ async def get_executive_dashboard(db: Session = Depends(get_db)):
# Trend data — only show current score, no simulated history
trend_data = []
if total > 0:
-now = datetime.utcnow()
+now = datetime.now(timezone.utc)
trend_data.append(TrendDataPoint(
date=now.strftime("%Y-%m-%d"),
score=round(score, 1),
@@ -204,7 +204,7 @@ async def get_executive_dashboard(db: Session = Depends(get_db)):
# Get upcoming deadlines
controls = ctrl_repo.get_all()
upcoming_deadlines = []
-today = datetime.utcnow().date()
+today = datetime.now(timezone.utc).date()
for ctrl in controls:
if ctrl.next_review_at:
@@ -280,7 +280,7 @@ async def get_executive_dashboard(db: Session = Depends(get_db)):
top_risks=top_risks,
upcoming_deadlines=upcoming_deadlines,
team_workload=team_workload,
-last_updated=datetime.utcnow().isoformat(),
+last_updated=datetime.now(timezone.utc).isoformat(),
)
@@ -305,7 +305,7 @@ async def get_compliance_trend(
# Trend data — only current score, no simulated history
trend_data = []
if total > 0:
-now = datetime.utcnow()
+now = datetime.now(timezone.utc)
trend_data.append({
"date": now.strftime("%Y-%m-%d"),
"score": round(current_score, 1),
@@ -318,7 +318,7 @@ async def get_compliance_trend(
"current_score": round(current_score, 1),
"trend": trend_data,
"period_months": months,
-"generated_at": datetime.utcnow().isoformat(),
+"generated_at": datetime.now(timezone.utc).isoformat(),
}


@@ -20,7 +20,7 @@ Endpoints:
"""
import logging
-from datetime import datetime
+from datetime import datetime, timezone
from typing import Optional, List
from fastapi import APIRouter, Depends, HTTPException, Query
@@ -691,7 +691,7 @@ async def update_dsfa_status(
params: dict = {
"id": dsfa_id, "tid": tid,
"status": request.status,
-"approved_at": datetime.utcnow() if request.status == "approved" else None,
+"approved_at": datetime.now(timezone.utc) if request.status == "approved" else None,
"approved_by": request.approved_by,
}
row = db.execute(
@@ -906,7 +906,7 @@ async def export_dsfa_json(
dsfa_data = _dsfa_to_response(row)
return {
-"exported_at": datetime.utcnow().isoformat(),
+"exported_at": datetime.now(timezone.utc).isoformat(),
"format": format,
"dsfa": dsfa_data,
}


@@ -7,7 +7,7 @@ Native Python/FastAPI Implementierung, ersetzt Go consent-service Proxy.
import io
import csv
import uuid
-from datetime import datetime, timedelta
+from datetime import datetime, timedelta, timezone
from typing import Optional, List, Dict, Any
from fastapi import APIRouter, Depends, HTTPException, Query, Header
@@ -168,7 +168,7 @@ def _get_tenant(x_tenant_id: Optional[str] = Header(None, alias='X-Tenant-ID'))
def _generate_request_number(db: Session, tenant_id: str) -> str:
"""Generate next request number: DSR-YYYY-NNNNNN"""
-year = datetime.utcnow().year
+year = datetime.now(timezone.utc).year
try:
result = db.execute(text("SELECT nextval('compliance_dsr_request_number_seq')"))
seq = result.scalar()
@@ -275,7 +275,7 @@ async def create_dsr(
if body.priority and body.priority not in VALID_PRIORITIES:
raise HTTPException(status_code=400, detail=f"Invalid priority. Must be one of: {VALID_PRIORITIES}")
-now = datetime.utcnow()
+now = datetime.now(timezone.utc)
deadline_days = DEADLINE_DAYS.get(body.request_type, 30)
request_number = _generate_request_number(db, tenant_id)
@@ -348,7 +348,7 @@ async def list_dsrs(
query = query.filter(DSRRequestDB.priority == priority)
if overdue_only:
query = query.filter(
-DSRRequestDB.deadline_at < datetime.utcnow(),
+DSRRequestDB.deadline_at < datetime.now(timezone.utc),
DSRRequestDB.status.notin_(["completed", "rejected", "cancelled"]),
)
if search:
@@ -399,7 +399,7 @@ async def get_dsr_stats(
by_type[t] = base.filter(DSRRequestDB.request_type == t).count()
# Overdue
-now = datetime.utcnow()
+now = datetime.now(timezone.utc)
overdue = base.filter(
DSRRequestDB.deadline_at < now,
DSRRequestDB.status.notin_(["completed", "rejected", "cancelled"]),
@@ -459,7 +459,7 @@ async def export_dsrs(
if format == "json":
return {
-"exported_at": datetime.utcnow().isoformat(),
+"exported_at": datetime.now(timezone.utc).isoformat(),
"total": len(dsrs),
"requests": [_dsr_to_dict(d) for d in dsrs],
}
@@ -506,7 +506,7 @@ async def process_deadlines(
db: Session = Depends(get_db),
):
"""Verarbeitet Fristen und markiert ueberfaellige DSRs."""
-now = datetime.utcnow()
+now = datetime.now(timezone.utc)
tid = uuid.UUID(tenant_id)
overdue = db.query(DSRRequestDB).filter(
@@ -714,7 +714,7 @@ async def publish_template_version(
if not version:
raise HTTPException(status_code=404, detail="Version not found")
-now = datetime.utcnow()
+now = datetime.now(timezone.utc)
version.status = "published"
version.published_at = now
version.published_by = "admin"
@@ -766,7 +766,7 @@ async def update_dsr(
dsr.internal_notes = body.internal_notes
if body.assigned_to is not None:
dsr.assigned_to = body.assigned_to
-dsr.assigned_at = datetime.utcnow()
+dsr.assigned_at = datetime.now(timezone.utc)
if body.request_text is not None:
dsr.request_text = body.request_text
if body.affected_systems is not None:
@@ -778,7 +778,7 @@ async def update_dsr(
if body.objection_details is not None:
dsr.objection_details = body.objection_details
-dsr.updated_at = datetime.utcnow()
+dsr.updated_at = datetime.now(timezone.utc)
db.commit()
db.refresh(dsr)
return _dsr_to_dict(dsr)
@@ -797,7 +797,7 @@ async def delete_dsr(
_record_history(db, dsr, "cancelled", comment="DSR storniert")
dsr.status = "cancelled"
-dsr.updated_at = datetime.utcnow()
+dsr.updated_at = datetime.now(timezone.utc)
db.commit()
return {"success": True, "message": "DSR cancelled"}
@@ -820,7 +820,7 @@ async def change_status(
dsr = _get_dsr_or_404(db, dsr_id, tenant_id)
_record_history(db, dsr, body.status, comment=body.comment)
dsr.status = body.status
-dsr.updated_at = datetime.utcnow()
+dsr.updated_at = datetime.now(timezone.utc)
db.commit()
db.refresh(dsr)
return _dsr_to_dict(dsr)
@@ -835,7 +835,7 @@ async def verify_identity(
):
"""Verifiziert die Identitaet des Antragstellers."""
dsr = _get_dsr_or_404(db, dsr_id, tenant_id)
-now = datetime.utcnow()
+now = datetime.now(timezone.utc)
dsr.identity_verified = True
dsr.verification_method = body.method
@@ -868,9 +868,9 @@ async def assign_dsr(
"""Weist eine DSR einem Bearbeiter zu."""
dsr = _get_dsr_or_404(db, dsr_id, tenant_id)
dsr.assigned_to = body.assignee_id
-dsr.assigned_at = datetime.utcnow()
+dsr.assigned_at = datetime.now(timezone.utc)
dsr.assigned_by = "admin"
-dsr.updated_at = datetime.utcnow()
+dsr.updated_at = datetime.now(timezone.utc)
db.commit()
db.refresh(dsr)
return _dsr_to_dict(dsr)
@@ -888,7 +888,7 @@ async def extend_deadline(
if dsr.status in ("completed", "rejected", "cancelled"):
raise HTTPException(status_code=400, detail="Cannot extend deadline for closed DSR")
-now = datetime.utcnow()
+now = datetime.now(timezone.utc)
current_deadline = dsr.extended_deadline_at or dsr.deadline_at
new_deadline = current_deadline + timedelta(days=body.days or 60)
@@ -916,7 +916,7 @@ async def complete_dsr(
if dsr.status in ("completed", "cancelled"):
raise HTTPException(status_code=400, detail="DSR already completed or cancelled")
-now = datetime.utcnow()
+now = datetime.now(timezone.utc)
_record_history(db, dsr, "completed", comment=body.summary)
dsr.status = "completed"
dsr.completed_at = now
@@ -941,7 +941,7 @@ async def reject_dsr(
if dsr.status in ("completed", "rejected", "cancelled"):
raise HTTPException(status_code=400, detail="DSR already closed")
-now = datetime.utcnow()
+now = datetime.now(timezone.utc)
_record_history(db, dsr, "rejected", comment=f"{body.reason} ({body.legal_basis})")
dsr.status = "rejected"
dsr.rejection_reason = body.reason
@@ -1024,7 +1024,7 @@ async def send_communication(
):
"""Sendet eine Kommunikation."""
dsr = _get_dsr_or_404(db, dsr_id, tenant_id)
-now = datetime.utcnow()
+now = datetime.now(timezone.utc)
comm = DSRCommunicationDB(
tenant_id=uuid.UUID(tenant_id),
@@ -1158,7 +1158,7 @@ async def update_exception_check(
check.applies = body.applies
check.notes = body.notes
check.checked_by = "admin"
-check.checked_at = datetime.utcnow()
+check.checked_at = datetime.now(timezone.utc)
db.commit()
db.refresh(check)

View File

@@ -15,7 +15,7 @@ Endpoints:
"""
import logging
-from datetime import datetime
+from datetime import datetime, timezone
from typing import Optional, List, Any, Dict
from fastapi import APIRouter, Depends, HTTPException, Query, Header
@@ -131,7 +131,7 @@ async def upsert_catalog(
if record:
record.selected_data_point_ids = request.selected_data_point_ids
record.custom_data_points = request.custom_data_points
-record.updated_at = datetime.utcnow()
+record.updated_at = datetime.now(timezone.utc)
else:
record = EinwilligungenCatalogDB(
tenant_id=tenant_id,
@@ -184,7 +184,7 @@ async def upsert_company(
if record:
record.data = request.data
-record.updated_at = datetime.utcnow()
+record.updated_at = datetime.now(timezone.utc)
else:
record = EinwilligungenCompanyDB(tenant_id=tenant_id, data=request.data)
db.add(record)
@@ -233,7 +233,7 @@ async def upsert_cookies(
if record:
record.categories = request.categories
record.config = request.config
-record.updated_at = datetime.utcnow()
+record.updated_at = datetime.now(timezone.utc)
else:
record = EinwilligungenCookiesDB(
tenant_id=tenant_id,
@@ -374,7 +374,7 @@ async def create_consent(
user_id=request.user_id,
data_point_id=request.data_point_id,
granted=request.granted,
-granted_at=datetime.utcnow(),
+granted_at=datetime.now(timezone.utc),
consent_version=request.consent_version,
source=request.source,
ip_address=request.ip_address,
@@ -443,7 +443,7 @@ async def revoke_consent(
if consent.revoked_at:
raise HTTPException(status_code=400, detail="Consent is already revoked")
-consent.revoked_at = datetime.utcnow()
+consent.revoked_at = datetime.now(timezone.utc)
_record_history(db, consent, 'revoked')
db.commit()
db.refresh(consent)

View File

@@ -6,7 +6,7 @@ Inklusive Versionierung, Approval-Workflow, Vorschau und Send-Logging.
"""
import uuid
-from datetime import datetime
+from datetime import datetime, timezone
from typing import Optional, Dict
from fastapi import APIRouter, Depends, HTTPException, Query, Header
@@ -271,7 +271,7 @@ async def update_settings(
if val is not None:
setattr(settings, field, val)
-settings.updated_at = datetime.utcnow()
+settings.updated_at = datetime.now(timezone.utc)
db.commit()
db.refresh(settings)
@@ -638,7 +638,7 @@ async def submit_version(
raise HTTPException(status_code=400, detail="Only draft versions can be submitted")
v.status = "review"
-v.submitted_at = datetime.utcnow()
+v.submitted_at = datetime.now(timezone.utc)
v.submitted_by = "admin"
db.commit()
db.refresh(v)
@@ -730,7 +730,7 @@ async def publish_version(
if v.status not in ("approved", "review", "draft"):
raise HTTPException(status_code=400, detail="Version cannot be published")
-now = datetime.utcnow()
+now = datetime.now(timezone.utc)
v.status = "published"
v.published_at = now
v.published_by = "admin"

View File

@@ -12,7 +12,7 @@ Endpoints:
"""
import logging
from datetime import datetime
from datetime import datetime, timezone
from typing import Optional, Any, Dict
from fastapi import APIRouter, Depends, HTTPException, Query, Header
@@ -244,7 +244,7 @@ async def update_escalation(
set_clauses = ", ".join(f"{k} = :{k}" for k in updates.keys())
updates["id"] = escalation_id
updates["updated_at"] = datetime.utcnow()
updates["updated_at"] = datetime.now(timezone.utc)
row = db.execute(
text(
@@ -277,7 +277,7 @@ async def update_status(
resolved_at = request.resolved_at
if request.status in ('resolved', 'closed') and resolved_at is None:
resolved_at = datetime.utcnow()
resolved_at = datetime.now(timezone.utc)
row = db.execute(
text(
@@ -288,7 +288,7 @@ async def update_status(
{
"status": request.status,
"resolved_at": resolved_at,
"updated_at": datetime.utcnow(),
"updated_at": datetime.now(timezone.utc),
"id": escalation_id,
},
).fetchone()


@@ -10,7 +10,7 @@ Endpoints:
import logging
import os
from datetime import datetime, timedelta
from datetime import datetime, timedelta, timezone
from typing import Optional
from collections import defaultdict
import uuid as uuid_module
@@ -370,8 +370,8 @@ def _store_evidence(
mime_type="application/json",
source="ci_pipeline",
ci_job_id=ci_job_id,
valid_from=datetime.utcnow(),
valid_until=datetime.utcnow() + timedelta(days=90),
valid_from=datetime.now(timezone.utc),
valid_until=datetime.now(timezone.utc) + timedelta(days=90),
status=EvidenceStatusEnum(parsed["evidence_status"]),
)
db.add(evidence)
@@ -455,7 +455,7 @@ def _update_risks(db: Session, *, source: str, control_id: str, ci_job_id: str,
tool=source,
control_id=control_id,
evidence_type=f"ci_{source}",
timestamp=datetime.utcnow().isoformat(),
timestamp=datetime.now(timezone.utc).isoformat(),
commit_sha=report_data.get("commit_sha", "unknown") if report_data else "unknown",
ci_job_id=ci_job_id,
findings=findings_detail,
@@ -571,7 +571,7 @@ async def get_ci_evidence_status(
Returns overview of recent evidence collected from CI/CD pipelines,
useful for dashboards and monitoring.
"""
cutoff_date = datetime.utcnow() - timedelta(days=days)
cutoff_date = datetime.now(timezone.utc) - timedelta(days=days)
# Build query
query = db.query(EvidenceDB).filter(


@@ -18,7 +18,7 @@ import logging
import re
import asyncio
from typing import Optional, List, Dict
from datetime import datetime
from datetime import datetime, timezone
from fastapi import APIRouter, Depends
from pydantic import BaseModel
@@ -171,7 +171,7 @@ def _get_or_create_regulation(
code=regulation_code,
name=regulation_name or regulation_code,
regulation_type=reg_type,
description=f"Auto-created from RAG extraction ({datetime.utcnow().date()})",
description=f"Auto-created from RAG extraction ({datetime.now(timezone.utc).date()})",
)
return reg


@@ -13,7 +13,7 @@ Provides endpoints for ISO 27001 certification-ready ISMS management:
import uuid
import hashlib
from datetime import datetime, date
from datetime import datetime, date, timezone
from typing import Optional
from fastapi import APIRouter, HTTPException, Query, Depends
@@ -102,7 +102,7 @@ def log_audit_trail(
new_value=new_value,
change_summary=change_summary,
performed_by=performed_by,
performed_at=datetime.utcnow(),
performed_at=datetime.now(timezone.utc),
checksum=create_signature(f"{entity_type}|{entity_id}|{action}|{performed_by}")
)
db.add(trail)
@@ -190,7 +190,7 @@ async def update_isms_scope(
setattr(scope, field, value)
scope.updated_by = updated_by
scope.updated_at = datetime.utcnow()
scope.updated_at = datetime.now(timezone.utc)
# Increment version if significant changes
version_parts = scope.version.split(".")
@@ -221,11 +221,11 @@ async def approve_isms_scope(
scope.status = ApprovalStatusEnum.APPROVED
scope.approved_by = data.approved_by
scope.approved_at = datetime.utcnow()
scope.approved_at = datetime.now(timezone.utc)
scope.effective_date = data.effective_date
scope.review_date = data.review_date
scope.approval_signature = create_signature(
f"{scope.scope_statement}|{data.approved_by}|{datetime.utcnow().isoformat()}"
f"{scope.scope_statement}|{data.approved_by}|{datetime.now(timezone.utc).isoformat()}"
)
log_audit_trail(db, "isms_scope", scope.id, "ISMS Scope", "approve", data.approved_by)
@@ -403,7 +403,7 @@ async def approve_policy(
policy.reviewed_by = data.reviewed_by
policy.approved_by = data.approved_by
policy.approved_at = datetime.utcnow()
policy.approved_at = datetime.now(timezone.utc)
policy.effective_date = data.effective_date
policy.next_review_date = date(
data.effective_date.year + (policy.review_frequency_months // 12),
@@ -412,7 +412,7 @@ async def approve_policy(
)
policy.status = ApprovalStatusEnum.APPROVED
policy.approval_signature = create_signature(
f"{policy.policy_id}|{data.approved_by}|{datetime.utcnow().isoformat()}"
f"{policy.policy_id}|{data.approved_by}|{datetime.now(timezone.utc).isoformat()}"
)
log_audit_trail(db, "isms_policy", policy.id, policy.policy_id, "approve", data.approved_by)
@@ -634,9 +634,9 @@ async def approve_soa_entry(
raise HTTPException(status_code=404, detail="SoA entry not found")
entry.reviewed_by = data.reviewed_by
entry.reviewed_at = datetime.utcnow()
entry.reviewed_at = datetime.now(timezone.utc)
entry.approved_by = data.approved_by
entry.approved_at = datetime.utcnow()
entry.approved_at = datetime.now(timezone.utc)
log_audit_trail(db, "soa", entry.id, entry.annex_a_control, "approve", data.approved_by)
db.commit()
@@ -812,7 +812,7 @@ async def close_finding(
finding.verification_method = data.verification_method
finding.verification_evidence = data.verification_evidence
finding.verified_by = data.closed_by
finding.verified_at = datetime.utcnow()
finding.verified_at = datetime.now(timezone.utc)
log_audit_trail(db, "audit_finding", finding.id, finding.finding_id, "close", data.closed_by)
db.commit()
@@ -1080,7 +1080,7 @@ async def approve_management_review(
review.status = "approved"
review.approved_by = data.approved_by
review.approved_at = datetime.utcnow()
review.approved_at = datetime.now(timezone.utc)
review.next_review_date = data.next_review_date
review.minutes_document_path = data.minutes_document_path
@@ -1392,7 +1392,7 @@ async def run_readiness_check(
# Save check result
check = ISMSReadinessCheckDB(
id=generate_id(),
check_date=datetime.utcnow(),
check_date=datetime.now(timezone.utc),
triggered_by=data.triggered_by,
overall_status=overall_status,
certification_possible=certification_possible,


@@ -6,7 +6,7 @@ Extended with: Public endpoints, User Consents, Consent Audit Log, Cookie Catego
import uuid as uuid_mod
import logging
from datetime import datetime
from datetime import datetime, timezone
from typing import Optional, List, Any, Dict
from fastapi import APIRouter, Depends, HTTPException, Query, Header, UploadFile, File
@@ -285,7 +285,7 @@ async def update_version(
for field, value in request.dict(exclude_none=True).items():
setattr(version, field, value)
version.updated_at = datetime.utcnow()
version.updated_at = datetime.now(timezone.utc)
db.commit()
db.refresh(version)
@@ -346,7 +346,7 @@ def _transition(
)
version.status = to_status
version.updated_at = datetime.utcnow()
version.updated_at = datetime.now(timezone.utc)
if extra_updates:
for k, v in extra_updates.items():
setattr(version, k, v)
@@ -378,7 +378,7 @@ async def approve_version(
return _transition(
db, version_id, ['review'], 'approved', 'approved',
request.approver, request.comment,
extra_updates={'approved_by': request.approver, 'approved_at': datetime.utcnow()}
extra_updates={'approved_by': request.approver, 'approved_at': datetime.now(timezone.utc)}
)
@@ -728,7 +728,7 @@ async def withdraw_consent(
if consent.withdrawn_at:
raise HTTPException(status_code=400, detail="Consent already withdrawn")
consent.withdrawn_at = datetime.utcnow()
consent.withdrawn_at = datetime.now(timezone.utc)
consent.consented = False
_log_consent_audit(
@@ -903,7 +903,7 @@ async def update_cookie_category(
if val is not None:
setattr(cat, field, val)
cat.updated_at = datetime.utcnow()
cat.updated_at = datetime.now(timezone.utc)
db.commit()
db.refresh(cat)
return _cookie_cat_to_dict(cat)


@@ -15,7 +15,7 @@ Endpoints:
import json
import logging
from datetime import datetime
from datetime import datetime, timezone
from typing import Optional, List, Any, Dict
from fastapi import APIRouter, Depends, HTTPException, Query
@@ -322,7 +322,7 @@ async def update_legal_template(
params: Dict[str, Any] = {
"id": template_id,
"tenant_id": tenant_id,
"updated_at": datetime.utcnow(),
"updated_at": datetime.now(timezone.utc),
}
jsonb_fields = {"placeholders", "inspiration_sources"}


@@ -13,7 +13,7 @@ Endpoints:
import json
import logging
from datetime import datetime
from datetime import datetime, timezone
from typing import Optional, List, Any, Dict
from fastapi import APIRouter, Depends, HTTPException, Query
@@ -253,7 +253,7 @@ async def update_loeschfrist(
):
"""Full update of a Loeschfrist policy."""
updates: Dict[str, Any] = {"id": policy_id, "tenant_id": tenant_id, "updated_at": datetime.utcnow()}
updates: Dict[str, Any] = {"id": policy_id, "tenant_id": tenant_id, "updated_at": datetime.now(timezone.utc)}
set_clauses = ["updated_at = :updated_at"]
for field, value in payload.model_dump(exclude_unset=True).items():
@@ -302,7 +302,7 @@ async def update_loeschfrist_status(
WHERE id = :id AND tenant_id = :tenant_id
RETURNING *
"""),
{"status": payload.status, "now": datetime.utcnow(), "id": policy_id, "tenant_id": tenant_id},
{"status": payload.status, "now": datetime.now(timezone.utc), "id": policy_id, "tenant_id": tenant_id},
).fetchone()
db.commit()


@@ -21,7 +21,7 @@ Endpoints:
import json
import logging
from datetime import datetime
from datetime import datetime, timezone
from typing import Optional, List, Any
from fastapi import APIRouter, Depends, HTTPException, Query, Header
@@ -852,11 +852,11 @@ async def update_incident(
# Auto-set timestamps based on status transitions
if updates.get("status") == "reported" and not updates.get("reported_to_authority_at"):
updates["reported_to_authority_at"] = datetime.utcnow().isoformat()
updates["reported_to_authority_at"] = datetime.now(timezone.utc).isoformat()
if updates.get("status") == "closed" and not updates.get("closed_at"):
updates["closed_at"] = datetime.utcnow().isoformat()
updates["closed_at"] = datetime.now(timezone.utc).isoformat()
updates["updated_at"] = datetime.utcnow().isoformat()
updates["updated_at"] = datetime.now(timezone.utc).isoformat()
set_parts = []
for k in updates:
@@ -984,7 +984,7 @@ async def update_template(
if not updates:
raise HTTPException(status_code=400, detail="No fields to update")
updates["updated_at"] = datetime.utcnow().isoformat()
updates["updated_at"] = datetime.now(timezone.utc).isoformat()
set_clauses = ", ".join(f"{k} = :{k}" for k in updates)
updates["id"] = template_id
updates["tenant_id"] = tenant_id


@@ -12,7 +12,7 @@ Endpoints:
"""
import logging
from datetime import datetime
from datetime import datetime, timezone
from typing import Optional, List, Any, Dict
from fastapi import APIRouter, Depends, HTTPException, Query, Header
@@ -228,7 +228,7 @@ async def update_obligation(
logger.info("update_obligation user_id=%s tenant_id=%s id=%s", x_user_id, tenant_id, obligation_id)
import json
updates: Dict[str, Any] = {"id": obligation_id, "tenant_id": tenant_id, "updated_at": datetime.utcnow()}
updates: Dict[str, Any] = {"id": obligation_id, "tenant_id": tenant_id, "updated_at": datetime.now(timezone.utc)}
set_clauses = ["updated_at = :updated_at"]
for field, value in payload.model_dump(exclude_unset=True).items():
@@ -274,7 +274,7 @@ async def update_obligation_status(
SET status = :status, updated_at = :now
WHERE id = :id AND tenant_id = :tenant_id
RETURNING *
"""), {"status": payload.status, "now": datetime.utcnow(), "id": obligation_id, "tenant_id": tenant_id}).fetchone()
"""), {"status": payload.status, "now": datetime.now(timezone.utc), "id": obligation_id, "tenant_id": tenant_id}).fetchone()
db.commit()
if not row:


@@ -10,7 +10,7 @@ Endpoints:
"""
import logging
from datetime import datetime
from datetime import datetime, timezone
from typing import Optional, Any, Dict
from fastapi import APIRouter, Depends, HTTPException, Query
@@ -177,7 +177,7 @@ async def create_metric(
"threshold": payload.threshold,
"trend": payload.trend,
"ai_system": payload.ai_system,
"last_measured": payload.last_measured or datetime.utcnow(),
"last_measured": payload.last_measured or datetime.now(timezone.utc),
}).fetchone()
db.commit()
return _row_to_dict(row)
@@ -192,7 +192,7 @@ async def update_metric(
):
"""Update a quality metric."""
updates: Dict[str, Any] = {"id": metric_id, "tenant_id": tenant_id, "updated_at": datetime.utcnow()}
updates: Dict[str, Any] = {"id": metric_id, "tenant_id": tenant_id, "updated_at": datetime.now(timezone.utc)}
set_clauses = ["updated_at = :updated_at"]
for field, value in payload.model_dump(exclude_unset=True).items():
@@ -296,7 +296,7 @@ async def create_test(
"duration": payload.duration,
"ai_system": payload.ai_system,
"details": payload.details,
"last_run": payload.last_run or datetime.utcnow(),
"last_run": payload.last_run or datetime.now(timezone.utc),
}).fetchone()
db.commit()
return _row_to_dict(row)
@@ -311,7 +311,7 @@ async def update_test(
):
"""Update a quality test."""
updates: Dict[str, Any] = {"id": test_id, "tenant_id": tenant_id, "updated_at": datetime.utcnow()}
updates: Dict[str, Any] = {"id": test_id, "tenant_id": tenant_id, "updated_at": datetime.now(timezone.utc)}
set_clauses = ["updated_at = :updated_at"]
for field, value in payload.model_dump(exclude_unset=True).items():


@@ -16,7 +16,7 @@ import logging
logger = logging.getLogger(__name__)
import os
from datetime import datetime
from datetime import datetime, timezone
from typing import Optional
from fastapi import APIRouter, Depends, HTTPException, Query, BackgroundTasks
@@ -393,11 +393,11 @@ async def update_requirement(requirement_id: str, updates: dict, db: Session = D
# Track audit changes
if 'audit_status' in updates:
requirement.last_audit_date = datetime.utcnow()
requirement.last_audit_date = datetime.now(timezone.utc)
# TODO: Get auditor from auth
requirement.last_auditor = updates.get('auditor_name', 'api_user')
requirement.updated_at = datetime.utcnow()
requirement.updated_at = datetime.now(timezone.utc)
db.commit()
db.refresh(requirement)


@@ -10,7 +10,7 @@ Endpoints:
"""
import logging
from datetime import datetime
from datetime import datetime, timezone
from typing import Optional, Any, Dict
from fastapi import APIRouter, Depends, HTTPException, Query
@@ -207,7 +207,7 @@ async def update_security_item(
):
"""Update a security backlog item."""
updates: Dict[str, Any] = {"id": item_id, "tenant_id": tenant_id, "updated_at": datetime.utcnow()}
updates: Dict[str, Any] = {"id": item_id, "tenant_id": tenant_id, "updated_at": datetime.now(timezone.utc)}
set_clauses = ["updated_at = :updated_at"]
for field, value in payload.model_dump(exclude_unset=True).items():


@@ -21,11 +21,11 @@ Endpoints:
GET /api/v1/admin/compliance-report — Compliance report
"""
from datetime import datetime
from datetime import datetime, timezone
from typing import Optional
from fastapi import APIRouter, HTTPException, Depends, Query
from pydantic import BaseModel, Field
from pydantic import BaseModel, ConfigDict, Field
from sqlalchemy.orm import Session
from database import get_db
@@ -83,8 +83,7 @@ class SourceResponse(BaseModel):
created_at: str
updated_at: Optional[str] = None
class Config:
from_attributes = True
model_config = ConfigDict(from_attributes=True)
class OperationUpdate(BaseModel):
@@ -530,7 +529,7 @@ async def get_policy_stats(db: Session = Depends(get_db)):
pii_rules = db.query(PIIRuleDB).filter(PIIRuleDB.active).count()
# Count blocked content entries from today
today_start = datetime.utcnow().replace(hour=0, minute=0, second=0, microsecond=0)
today_start = datetime.now(timezone.utc).replace(hour=0, minute=0, second=0, microsecond=0)
blocked_today = db.query(BlockedContentDB).filter(
BlockedContentDB.created_at >= today_start,
).count()
@@ -553,7 +552,7 @@ async def get_compliance_report(db: Session = Depends(get_db)):
pii_rules = db.query(PIIRuleDB).filter(PIIRuleDB.active).all()
return {
"report_date": datetime.utcnow().isoformat(),
"report_date": datetime.now(timezone.utc).isoformat(),
"summary": {
"active_sources": len(sources),
"active_pii_rules": len(pii_rules),


@@ -49,7 +49,7 @@ vendor_findings, vendor_control_instances).
import json
import logging
import uuid
from datetime import datetime
from datetime import datetime, timezone
from typing import Optional
from fastapi import APIRouter, Depends, HTTPException, Query
@@ -69,7 +69,7 @@ DEFAULT_TENANT_ID = "9282a473-5c95-4b3a-bf78-0ecc0ec71d3e"
# =============================================================================
def _now_iso() -> str:
return datetime.utcnow().isoformat() + "Z"
return datetime.now(timezone.utc).isoformat().replace("+00:00", "Z")
def _ok(data, status_code: int = 200):
@@ -418,7 +418,7 @@ def create_vendor(body: dict = {}, db: Session = Depends(get_db)):
data = _to_snake(body)
vid = str(uuid.uuid4())
tid = data.get("tenant_id", DEFAULT_TENANT_ID)
now = datetime.utcnow().isoformat()
now = datetime.now(timezone.utc).isoformat()
db.execute(text("""
INSERT INTO vendor_vendors (
@@ -498,7 +498,7 @@ def update_vendor(vendor_id: str, body: dict = {}, db: Session = Depends(get_db)
raise HTTPException(404, "Vendor not found")
data = _to_snake(body)
now = datetime.utcnow().isoformat()
now = datetime.now(timezone.utc).isoformat()
# Build dynamic SET clause
allowed = [
@@ -558,7 +558,7 @@ def patch_vendor_status(vendor_id: str, body: dict = {}, db: Session = Depends(g
result = db.execute(text("""
UPDATE vendor_vendors SET status = :status, updated_at = :now WHERE id = :id
"""), {"id": vendor_id, "status": new_status, "now": datetime.utcnow().isoformat()})
"""), {"id": vendor_id, "status": new_status, "now": datetime.now(timezone.utc).isoformat()})
db.commit()
if result.rowcount == 0:
raise HTTPException(404, "Vendor not found")
@@ -620,7 +620,7 @@ def create_contract(body: dict = {}, db: Session = Depends(get_db)):
data = _to_snake(body)
cid = str(uuid.uuid4())
tid = data.get("tenant_id", DEFAULT_TENANT_ID)
now = datetime.utcnow().isoformat()
now = datetime.now(timezone.utc).isoformat()
db.execute(text("""
INSERT INTO vendor_contracts (
@@ -682,7 +682,7 @@ def update_contract(contract_id: str, body: dict = {}, db: Session = Depends(get
raise HTTPException(404, "Contract not found")
data = _to_snake(body)
now = datetime.utcnow().isoformat()
now = datetime.now(timezone.utc).isoformat()
allowed = [
"vendor_id", "file_name", "original_name", "mime_type", "file_size",
@@ -781,7 +781,7 @@ def create_finding(body: dict = {}, db: Session = Depends(get_db)):
data = _to_snake(body)
fid = str(uuid.uuid4())
tid = data.get("tenant_id", DEFAULT_TENANT_ID)
now = datetime.utcnow().isoformat()
now = datetime.now(timezone.utc).isoformat()
db.execute(text("""
INSERT INTO vendor_findings (
@@ -831,7 +831,7 @@ def update_finding(finding_id: str, body: dict = {}, db: Session = Depends(get_d
raise HTTPException(404, "Finding not found")
data = _to_snake(body)
now = datetime.utcnow().isoformat()
now = datetime.now(timezone.utc).isoformat()
allowed = [
"vendor_id", "contract_id", "finding_type", "category", "severity",
@@ -920,7 +920,7 @@ def create_control_instance(body: dict = {}, db: Session = Depends(get_db)):
data = _to_snake(body)
ciid = str(uuid.uuid4())
tid = data.get("tenant_id", DEFAULT_TENANT_ID)
now = datetime.utcnow().isoformat()
now = datetime.now(timezone.utc).isoformat()
db.execute(text("""
INSERT INTO vendor_control_instances (
@@ -965,7 +965,7 @@ def update_control_instance(instance_id: str, body: dict = {}, db: Session = Dep
raise HTTPException(404, "Control instance not found")
data = _to_snake(body)
now = datetime.utcnow().isoformat()
now = datetime.now(timezone.utc).isoformat()
allowed = [
"vendor_id", "control_id", "control_domain",
@@ -1050,7 +1050,7 @@ def list_controls(
def create_control(body: dict = {}, db: Session = Depends(get_db)):
cid = str(uuid.uuid4())
tid = body.get("tenantId", body.get("tenant_id", DEFAULT_TENANT_ID))
now = datetime.utcnow().isoformat()
now = datetime.now(timezone.utc).isoformat()
db.execute(text("""
INSERT INTO vendor_compliance_controls (


@@ -119,7 +119,7 @@ async def upsert_organization(
else:
for field, value in request.dict(exclude_none=True).items():
setattr(org, field, value)
org.updated_at = datetime.utcnow()
org.updated_at = datetime.now(timezone.utc)
db.commit()
db.refresh(org)
@@ -291,7 +291,7 @@ async def update_activity(
updates = request.dict(exclude_none=True)
for field, value in updates.items():
setattr(act, field, value)
act.updated_at = datetime.utcnow()
act.updated_at = datetime.now(timezone.utc)
_log_audit(
db,
@@ -408,7 +408,7 @@ async def export_activities(
return _export_csv(activities)
return {
"exported_at": datetime.utcnow().isoformat(),
"exported_at": datetime.now(timezone.utc).isoformat(),
"organization": {
"name": org.organization_name if org else "",
"dpo_name": org.dpo_name if org else "",
@@ -482,7 +482,7 @@ def _export_csv(activities: list) -> StreamingResponse:
iter([output.getvalue()]),
media_type='text/csv; charset=utf-8',
headers={
'Content-Disposition': f'attachment; filename="vvt_export_{datetime.utcnow().strftime("%Y%m%d")}.csv"'
'Content-Disposition': f'attachment; filename="vvt_export_{datetime.now(timezone.utc).strftime("%Y%m%d")}.csv"'
},
)


@@ -0,0 +1,141 @@
"""
AI System & Audit Export models — extracted from compliance/db/models.py.
Covers AI Act system registration/classification and the audit export package
tracker. Re-exported from ``compliance.db.models`` for backwards compatibility.
DO NOT change __tablename__, column names, or relationship strings.
"""
import uuid
import enum
from datetime import datetime, timezone
from sqlalchemy import (
Column, String, Text, Integer, DateTime, Date,
Enum, JSON, Index, Float,
)
from classroom_engine.database import Base
# ============================================================================
# ENUMS
# ============================================================================
class AIClassificationEnum(str, enum.Enum):
"""AI Act risk classification."""
PROHIBITED = "prohibited"
HIGH_RISK = "high-risk"
LIMITED_RISK = "limited-risk"
MINIMAL_RISK = "minimal-risk"
UNCLASSIFIED = "unclassified"
class AISystemStatusEnum(str, enum.Enum):
"""Status of an AI system in compliance tracking."""
DRAFT = "draft"
CLASSIFIED = "classified"
COMPLIANT = "compliant"
NON_COMPLIANT = "non-compliant"
class ExportStatusEnum(str, enum.Enum):
"""Status of audit export."""
PENDING = "pending"
GENERATING = "generating"
COMPLETED = "completed"
FAILED = "failed"
# ============================================================================
# MODELS
# ============================================================================
class AISystemDB(Base):
"""
AI System registry for AI Act compliance.
Tracks AI systems, their risk classification, and compliance status.
"""
__tablename__ = 'compliance_ai_systems'
id = Column(String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
name = Column(String(300), nullable=False)
description = Column(Text)
purpose = Column(String(500))
sector = Column(String(100))
# AI Act classification
classification = Column(Enum(AIClassificationEnum), default=AIClassificationEnum.UNCLASSIFIED)
status = Column(Enum(AISystemStatusEnum), default=AISystemStatusEnum.DRAFT)
# Assessment
assessment_date = Column(DateTime)
assessment_result = Column(JSON) # Full assessment result
obligations = Column(JSON) # List of AI Act obligations
risk_factors = Column(JSON) # Risk factors from assessment
recommendations = Column(JSON) # Recommendations from assessment
# Timestamps
created_at = Column(DateTime, default=lambda: datetime.now(timezone.utc))
updated_at = Column(DateTime, default=lambda: datetime.now(timezone.utc), onupdate=lambda: datetime.now(timezone.utc))
__table_args__ = (
Index('ix_ai_system_classification', 'classification'),
Index('ix_ai_system_status', 'status'),
)
def __repr__(self):
return f"<AISystem {self.name} ({self.classification.value})>"
class AuditExportDB(Base):
"""
Tracks audit export packages generated for external auditors.
"""
__tablename__ = 'compliance_audit_exports'
id = Column(String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
export_type = Column(String(50), nullable=False) # "full", "controls_only", "evidence_only"
export_name = Column(String(200)) # User-friendly name
# Scope
included_regulations = Column(JSON) # List of regulation codes
included_domains = Column(JSON) # List of control domains
date_range_start = Column(Date)
date_range_end = Column(Date)
# Generation
requested_by = Column(String(100), nullable=False)
requested_at = Column(DateTime, nullable=False, default=lambda: datetime.now(timezone.utc))
completed_at = Column(DateTime)
# Output
file_path = Column(String(500))
file_hash = Column(String(64)) # SHA-256 of ZIP
file_size_bytes = Column(Integer)
status = Column(Enum(ExportStatusEnum), default=ExportStatusEnum.PENDING)
error_message = Column(Text)
# Statistics
total_controls = Column(Integer)
total_evidence = Column(Integer)
compliance_score = Column(Float)
# Timestamps
created_at = Column(DateTime, default=lambda: datetime.now(timezone.utc))
updated_at = Column(DateTime, default=lambda: datetime.now(timezone.utc), onupdate=lambda: datetime.now(timezone.utc))
def __repr__(self):
return f"<AuditExport {self.export_type} by {self.requested_by}>"
__all__ = [
"AIClassificationEnum",
"AISystemStatusEnum",
"ExportStatusEnum",
"AISystemDB",
"AuditExportDB",
]
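The column defaults above wrap `datetime.now(timezone.utc)` in lambdas deliberately: SQLAlchemy invokes a callable default per row, whereas a bare call would be evaluated once at import time and stamp every row with that single instant. A stdlib-only sketch of the distinction:

```python
import time
from datetime import datetime, timezone

# Evaluated once, at definition time — the import-time pitfall:
frozen_default = datetime.now(timezone.utc)

# Evaluated on each call — what default=lambda: ... gives each row:
def per_row_default():
    return datetime.now(timezone.utc)

time.sleep(0.01)
assert per_row_default() > frozen_default  # later call, later timestamp
```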


@@ -0,0 +1,177 @@
"""
Audit Session & Sign-Off models — Sprint 3 Phase 3.
Extracted from compliance/db/models.py as the first worked example of the
Phase 1 model split. The classes are re-exported from compliance.db.models
for backwards compatibility, so existing imports continue to work unchanged.
Tables:
- compliance_audit_sessions: Structured compliance audit sessions
- compliance_audit_signoffs: Per-requirement sign-offs with digital signatures
DO NOT change __tablename__, column names, or relationship strings — the
database schema is frozen.
"""
import uuid
import enum
from datetime import datetime, timezone
from sqlalchemy import (
Column, String, Text, Integer, DateTime,
ForeignKey, Enum, JSON, Index,
)
from sqlalchemy.orm import relationship
from classroom_engine.database import Base
# ============================================================================
# ENUMS
# ============================================================================
class AuditResultEnum(str, enum.Enum):
"""Result of an audit sign-off for a requirement."""
COMPLIANT = "compliant" # Fully compliant
COMPLIANT_WITH_NOTES = "compliant_notes" # Compliant with observations
NON_COMPLIANT = "non_compliant" # Not compliant - remediation required
NOT_APPLICABLE = "not_applicable" # Not applicable to this audit
PENDING = "pending" # Not yet reviewed
class AuditSessionStatusEnum(str, enum.Enum):
"""Status of an audit session."""
DRAFT = "draft" # Session created, not started
IN_PROGRESS = "in_progress" # Audit in progress
COMPLETED = "completed" # All items reviewed
ARCHIVED = "archived" # Historical record
# ============================================================================
# MODELS
# ============================================================================
class AuditSessionDB(Base):
"""
Audit session for structured compliance reviews.
Enables auditors to:
- Create named audit sessions (e.g., "Q1 2026 GDPR Audit")
- Track progress through requirements
- Sign off individual items with digital signatures
- Generate audit reports
"""
__tablename__ = 'compliance_audit_sessions'
id = Column(String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
name = Column(String(200), nullable=False) # e.g., "Q1 2026 Compliance Audit"
description = Column(Text)
# Auditor information
auditor_name = Column(String(100), nullable=False) # e.g., "Dr. Thomas Müller"
auditor_email = Column(String(200))
auditor_organization = Column(String(200)) # External auditor company
# Session scope
status = Column(Enum(AuditSessionStatusEnum), default=AuditSessionStatusEnum.DRAFT)
regulation_ids = Column(JSON) # Filter: ["GDPR", "AIACT"] or null for all
# Progress tracking
total_items = Column(Integer, default=0)
completed_items = Column(Integer, default=0)
compliant_count = Column(Integer, default=0)
non_compliant_count = Column(Integer, default=0)
# Timestamps
created_at = Column(DateTime, default=lambda: datetime.now(timezone.utc))
started_at = Column(DateTime) # When audit began
completed_at = Column(DateTime) # When audit finished
updated_at = Column(DateTime, default=lambda: datetime.now(timezone.utc), onupdate=lambda: datetime.now(timezone.utc))
# Relationships
signoffs = relationship("AuditSignOffDB", back_populates="session", cascade="all, delete-orphan")
__table_args__ = (
Index('ix_audit_session_status', 'status'),
Index('ix_audit_session_auditor', 'auditor_name'),
)
def __repr__(self):
return f"<AuditSession {self.name} ({self.status.value})>"
@property
def completion_percentage(self) -> float:
"""Calculate completion percentage."""
if self.total_items == 0:
return 0.0
return round((self.completed_items / self.total_items) * 100, 1)
class AuditSignOffDB(Base):
"""
Individual sign-off for a requirement within an audit session.
Features:
- Records audit result (compliant, non-compliant, etc.)
- Stores auditor notes and observations
- Creates digital signature (SHA-256 hash) for tamper evidence
"""
__tablename__ = 'compliance_audit_signoffs'
id = Column(String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
session_id = Column(String(36), ForeignKey('compliance_audit_sessions.id'), nullable=False, index=True)
requirement_id = Column(String(36), ForeignKey('compliance_requirements.id'), nullable=False, index=True)
# Audit result
result = Column(Enum(AuditResultEnum), default=AuditResultEnum.PENDING)
notes = Column(Text) # Auditor observations
# Evidence references for this sign-off
evidence_ids = Column(JSON) # List of evidence IDs reviewed
# Digital signature (SHA-256 hash of result + auditor + timestamp)
signature_hash = Column(String(64)) # SHA-256 hex string
signed_at = Column(DateTime)
signed_by = Column(String(100)) # Auditor name at time of signing
# Timestamps
created_at = Column(DateTime, default=lambda: datetime.now(timezone.utc))
updated_at = Column(DateTime, default=lambda: datetime.now(timezone.utc), onupdate=lambda: datetime.now(timezone.utc))
# Relationships
session = relationship("AuditSessionDB", back_populates="signoffs")
requirement = relationship("RequirementDB")
__table_args__ = (
Index('ix_signoff_session_requirement', 'session_id', 'requirement_id', unique=True),
Index('ix_signoff_result', 'result'),
)
def __repr__(self):
return f"<AuditSignOff {self.requirement_id}: {self.result.value}>"
def create_signature(self, auditor_name: str) -> str:
"""
Create a digital signature for this sign-off.
Returns SHA-256 hash of: result + requirement_id + auditor_name + timestamp
"""
import hashlib
signed_at = datetime.now(timezone.utc)
data = f"{self.result.value}|{self.requirement_id}|{auditor_name}|{signed_at.isoformat()}"
signature = hashlib.sha256(data.encode()).hexdigest()
self.signature_hash = signature
self.signed_at = signed_at  # same instant that went into the hash, so the signature stays re-verifiable
self.signed_by = auditor_name
return signature
__all__ = [
"AuditResultEnum",
"AuditSessionStatusEnum",
"AuditSessionDB",
"AuditSignOffDB",
]
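`create_signature()` hashes result, requirement, auditor, and a timestamp. A hypothetical re-verification helper — not part of the commit, and assuming the stored `signed_at` is the exact instant that was hashed — could recompute and compare:

```python
import hashlib
from datetime import datetime, timezone

def verify_signoff(result_value: str, requirement_id: str, signed_by: str,
                   signed_at: datetime, signature_hash: str) -> bool:
    """Recompute the SHA-256 over the signed fields and compare (tamper check)."""
    data = f"{result_value}|{requirement_id}|{signed_by}|{signed_at.isoformat()}"
    return hashlib.sha256(data.encode()).hexdigest() == signature_hash

# Round trip using the same field layout as create_signature():
signed_at = datetime.now(timezone.utc)
payload = f"compliant|req-123|Jane Auditor|{signed_at.isoformat()}"
sig = hashlib.sha256(payload.encode()).hexdigest()
assert verify_signoff("compliant", "req-123", "Jane Auditor", signed_at, sig)
assert not verify_signoff("non_compliant", "req-123", "Jane Auditor", signed_at, sig)
```

A constant-time comparison (`hmac.compare_digest`) would be the stricter choice if this ever guarded an untrusted input path.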


@@ -0,0 +1,279 @@
"""
Control, Evidence, and Risk models — extracted from compliance/db/models.py.
Covers the control framework (ControlDB), requirement↔control mappings,
evidence artifacts, and the risk register. Re-exported from
``compliance.db.models`` for backwards compatibility.
DO NOT change __tablename__, column names, or relationship strings.
"""
import uuid
import enum
from datetime import datetime, date, timezone
from sqlalchemy import (
Column, String, Text, Integer, Boolean, DateTime, Date,
ForeignKey, Enum, JSON, Index,
)
from sqlalchemy.orm import relationship
from classroom_engine.database import Base
# ============================================================================
# ENUMS
# ============================================================================
class ControlTypeEnum(str, enum.Enum):
"""Type of security control."""
PREVENTIVE = "preventive" # Prevents incidents
DETECTIVE = "detective" # Detects incidents
CORRECTIVE = "corrective" # Corrects after incidents
class ControlDomainEnum(str, enum.Enum):
"""Domain/category of control."""
GOVERNANCE = "gov" # Governance & Organization
PRIVACY = "priv" # Privacy & Data Protection
IAM = "iam" # Identity & Access Management
CRYPTO = "crypto" # Cryptography & Key Management
SDLC = "sdlc" # Secure Development Lifecycle
OPS = "ops" # Operations & Monitoring
AI = "ai" # AI-specific controls
CRA = "cra" # CRA & Supply Chain
AUDIT = "aud" # Audit & Traceability
class ControlStatusEnum(str, enum.Enum):
"""Implementation status of a control."""
PASS = "pass" # Fully implemented & passing
PARTIAL = "partial" # Partially implemented
FAIL = "fail" # Not passing
NOT_APPLICABLE = "n/a" # Not applicable
PLANNED = "planned" # Planned for implementation
class RiskLevelEnum(str, enum.Enum):
"""Risk severity level."""
LOW = "low"
MEDIUM = "medium"
HIGH = "high"
CRITICAL = "critical"
class EvidenceStatusEnum(str, enum.Enum):
"""Status of evidence artifact."""
VALID = "valid" # Currently valid
EXPIRED = "expired" # Past validity date
PENDING = "pending" # Awaiting validation
FAILED = "failed" # Failed validation
# ============================================================================
# MODELS
# ============================================================================
class ControlDB(Base):
"""
Technical or organizational security control.
Examples: PRIV-001 (Verarbeitungsverzeichnis, the GDPR record of processing activities), SDLC-001 (SAST Scanning)
"""
__tablename__ = 'compliance_controls'
id = Column(String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
control_id = Column(String(20), unique=True, nullable=False, index=True) # e.g., "PRIV-001"
domain = Column(Enum(ControlDomainEnum), nullable=False, index=True)
control_type = Column(Enum(ControlTypeEnum), nullable=False)
title = Column(String(300), nullable=False)
description = Column(Text)
pass_criteria = Column(Text, nullable=False) # Measurable pass criteria
implementation_guidance = Column(Text) # How to implement
# Code/Evidence references
code_reference = Column(String(500)) # e.g., "backend/middleware/pii_redactor.py:45"
documentation_url = Column(String(500)) # Link to internal docs
# Automation
is_automated = Column(Boolean, default=False)
automation_tool = Column(String(100)) # e.g., "Semgrep", "Trivy"
automation_config = Column(JSON) # Tool-specific config
# Status
status = Column(Enum(ControlStatusEnum), default=ControlStatusEnum.PLANNED)
status_notes = Column(Text)
# Ownership & Review
owner = Column(String(100)) # Responsible person/team
review_frequency_days = Column(Integer, default=90)
last_reviewed_at = Column(DateTime)
next_review_at = Column(DateTime)
# Timestamps
created_at = Column(DateTime, default=lambda: datetime.now(timezone.utc))
updated_at = Column(DateTime, default=lambda: datetime.now(timezone.utc), onupdate=lambda: datetime.now(timezone.utc))
# Relationships
mappings = relationship("ControlMappingDB", back_populates="control", cascade="all, delete-orphan")
evidence = relationship("EvidenceDB", back_populates="control", cascade="all, delete-orphan")
__table_args__ = (
Index('ix_control_domain_status', 'domain', 'status'),
)
def __repr__(self):
return f"<Control {self.control_id}: {self.title}>"
class ControlMappingDB(Base):
"""
Maps requirements to controls (many-to-many with metadata).
"""
__tablename__ = 'compliance_control_mappings'
id = Column(String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
requirement_id = Column(String(36), ForeignKey('compliance_requirements.id'), nullable=False, index=True)
control_id = Column(String(36), ForeignKey('compliance_controls.id'), nullable=False, index=True)
coverage_level = Column(String(20), default="full") # "full", "partial", "planned"
notes = Column(Text) # Explanation of coverage
# Timestamps
created_at = Column(DateTime, default=lambda: datetime.now(timezone.utc))
updated_at = Column(DateTime, default=lambda: datetime.now(timezone.utc), onupdate=lambda: datetime.now(timezone.utc))
# Relationships
requirement = relationship("RequirementDB", back_populates="control_mappings")
control = relationship("ControlDB", back_populates="mappings")
__table_args__ = (
Index('ix_mapping_req_ctrl', 'requirement_id', 'control_id', unique=True),
)
class EvidenceDB(Base):
"""
Audit evidence for controls.
Types: scan_report, policy_document, config_snapshot, test_result,
manual_upload, screenshot, external_link
"""
__tablename__ = 'compliance_evidence'
id = Column(String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
control_id = Column(String(36), ForeignKey('compliance_controls.id'), nullable=False, index=True)
evidence_type = Column(String(50), nullable=False) # Type of evidence
title = Column(String(300), nullable=False)
description = Column(Text)
# File/Link storage
artifact_path = Column(String(500)) # Local file path
artifact_url = Column(String(500)) # External URL
artifact_hash = Column(String(64)) # SHA-256 hash
file_size_bytes = Column(Integer)
mime_type = Column(String(100))
# Validity period
valid_from = Column(DateTime, nullable=False, default=lambda: datetime.now(timezone.utc))
valid_until = Column(DateTime) # NULL = no expiry
status = Column(Enum(EvidenceStatusEnum), default=EvidenceStatusEnum.VALID)
# Source tracking
source = Column(String(100)) # "ci_pipeline", "manual", "api"
ci_job_id = Column(String(100)) # CI/CD job reference
uploaded_by = Column(String(100)) # User who uploaded
# Timestamps
collected_at = Column(DateTime, default=lambda: datetime.now(timezone.utc))
created_at = Column(DateTime, default=lambda: datetime.now(timezone.utc))
updated_at = Column(DateTime, default=lambda: datetime.now(timezone.utc), onupdate=lambda: datetime.now(timezone.utc))
# Relationships
control = relationship("ControlDB", back_populates="evidence")
__table_args__ = (
Index('ix_evidence_control_type', 'control_id', 'evidence_type'),
Index('ix_evidence_status', 'status'),
)
def __repr__(self):
return f"<Evidence {self.evidence_type}: {self.title}>"
class RiskDB(Base):
"""
Risk register entry with likelihood x impact scoring.
"""
__tablename__ = 'compliance_risks'
id = Column(String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
risk_id = Column(String(20), unique=True, nullable=False, index=True) # e.g., "RISK-001"
title = Column(String(300), nullable=False)
description = Column(Text)
category = Column(String(50), nullable=False) # "data_breach", "compliance_gap", etc.
# Inherent risk (before controls)
likelihood = Column(Integer, nullable=False) # 1-5
impact = Column(Integer, nullable=False) # 1-5
inherent_risk = Column(Enum(RiskLevelEnum), nullable=False)
# Mitigating controls
mitigating_controls = Column(JSON) # List of control_ids
# Residual risk (after controls)
residual_likelihood = Column(Integer)
residual_impact = Column(Integer)
residual_risk = Column(Enum(RiskLevelEnum))
# Management
owner = Column(String(100))
status = Column(String(20), default="open") # "open", "mitigated", "accepted", "transferred"
treatment_plan = Column(Text)
# Review
identified_date = Column(Date, default=date.today)
review_date = Column(Date)
last_assessed_at = Column(DateTime)
# Timestamps
created_at = Column(DateTime, default=lambda: datetime.now(timezone.utc))
updated_at = Column(DateTime, default=lambda: datetime.now(timezone.utc), onupdate=lambda: datetime.now(timezone.utc))
__table_args__ = (
Index('ix_risk_category_status', 'category', 'status'),
Index('ix_risk_inherent', 'inherent_risk'),
)
def __repr__(self):
return f"<Risk {self.risk_id}: {self.title}>"
@staticmethod
def calculate_risk_level(likelihood: int, impact: int) -> RiskLevelEnum:
"""Calculate risk level from likelihood x impact matrix."""
score = likelihood * impact
if score >= 20:
return RiskLevelEnum.CRITICAL
elif score >= 12:
return RiskLevelEnum.HIGH
elif score >= 6:
return RiskLevelEnum.MEDIUM
else:
return RiskLevelEnum.LOW
__all__ = [
"ControlTypeEnum",
"ControlDomainEnum",
"ControlStatusEnum",
"RiskLevelEnum",
"EvidenceStatusEnum",
"ControlDB",
"ControlMappingDB",
"EvidenceDB",
"RiskDB",
]
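`RiskDB.calculate_risk_level` maps the 5x5 likelihood x impact product onto four bands with thresholds at 6, 12, and 20. The banding can be checked in isolation; this restates the static method standalone (string values match `RiskLevelEnum`):

```python
def calculate_risk_level(likelihood: int, impact: int) -> str:
    """Same thresholds as RiskDB.calculate_risk_level (score = likelihood * impact)."""
    score = likelihood * impact
    if score >= 20:
        return "critical"
    elif score >= 12:
        return "high"
    elif score >= 6:
        return "medium"
    return "low"
```

The residual-risk columns reuse the same mapping with `residual_likelihood` and `residual_impact` as inputs.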


@@ -0,0 +1,468 @@
"""
ISMS Audit Execution models (ISO 27001 Clauses 9-10) — extracted from
compliance/db/models.py.
Covers findings, corrective actions (CAPA), management reviews, internal
audits, audit trail, and readiness checks. The governance side (scope,
context, policies, objectives, SoA) lives in ``isms_governance_models.py``.
Re-exported from ``compliance.db.models`` for backwards compatibility.
DO NOT change __tablename__, column names, or relationship strings.
"""
import uuid
import enum
from datetime import datetime, date, timezone
from sqlalchemy import (
Column, String, Text, Integer, Boolean, DateTime, Date,
ForeignKey, Enum, JSON, Index, Float,
)
from sqlalchemy.orm import relationship
from classroom_engine.database import Base
# ============================================================================
# ENUMS
# ============================================================================
class FindingTypeEnum(str, enum.Enum):
"""ISO 27001 audit finding classification."""
MAJOR = "major" # Major nonconformity - blocks certification
MINOR = "minor" # Minor nonconformity - requires CAPA
OFI = "ofi" # Opportunity for Improvement
POSITIVE = "positive" # Positive observation
class FindingStatusEnum(str, enum.Enum):
"""Status of an audit finding."""
OPEN = "open"
IN_PROGRESS = "in_progress"
CORRECTIVE_ACTION_PENDING = "capa_pending"
VERIFICATION_PENDING = "verification_pending"
VERIFIED = "verified"
CLOSED = "closed"
class CAPATypeEnum(str, enum.Enum):
"""Type of corrective/preventive action."""
CORRECTIVE = "corrective" # Fix the nonconformity
PREVENTIVE = "preventive" # Prevent recurrence
BOTH = "both"
# ============================================================================
# MODELS
# ============================================================================
class AuditFindingDB(Base):
"""
Audit Finding with ISO 27001 Classification (Major/Minor/OFI)
Tracks findings from internal and external audits with proper
classification and CAPA workflow.
"""
__tablename__ = 'compliance_audit_findings'
id = Column(String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
finding_id = Column(String(30), unique=True, nullable=False, index=True) # e.g., "FIND-2026-001"
# Source
audit_session_id = Column(String(36), ForeignKey('compliance_audit_sessions.id'), index=True)
internal_audit_id = Column(String(36), ForeignKey('compliance_internal_audits.id'), index=True)
# Classification (CRITICAL for ISO 27001!)
finding_type = Column(Enum(FindingTypeEnum), nullable=False)
# ISO reference
iso_chapter = Column(String(20)) # e.g., "6.1.2", "9.2"
annex_a_control = Column(String(20)) # e.g., "A.8.2"
# Finding details
title = Column(String(300), nullable=False)
description = Column(Text, nullable=False)
objective_evidence = Column(Text, nullable=False) # What the auditor observed
# Root cause analysis
root_cause = Column(Text)
root_cause_method = Column(String(50)) # "5-why", "fishbone", "pareto"
# Impact assessment
impact_description = Column(Text)
affected_processes = Column(JSON)
affected_assets = Column(JSON)
# Status tracking
status = Column(Enum(FindingStatusEnum), default=FindingStatusEnum.OPEN)
# Responsibility
owner = Column(String(100)) # Person responsible for closure
auditor = Column(String(100)) # Auditor who raised finding
# Dates
identified_date = Column(Date, nullable=False, default=date.today)
due_date = Column(Date) # Deadline for closure
closed_date = Column(Date)
# Verification
verification_method = Column(Text)
verified_by = Column(String(100))
verified_at = Column(DateTime)
verification_evidence = Column(Text)
# Closure
closure_notes = Column(Text)
closed_by = Column(String(100))
# Timestamps
created_at = Column(DateTime, default=lambda: datetime.now(timezone.utc))
updated_at = Column(DateTime, default=lambda: datetime.now(timezone.utc), onupdate=lambda: datetime.now(timezone.utc))
# Relationships
corrective_actions = relationship("CorrectiveActionDB", back_populates="finding", cascade="all, delete-orphan")
__table_args__ = (
Index('ix_finding_type_status', 'finding_type', 'status'),
Index('ix_finding_due_date', 'due_date'),
)
def __repr__(self):
return f"<AuditFinding {self.finding_id}: {self.finding_type.value}>"
@property
def is_blocking(self) -> bool:
"""Major findings block certification."""
return self.finding_type == FindingTypeEnum.MAJOR and self.status != FindingStatusEnum.CLOSED
class CorrectiveActionDB(Base):
"""
Corrective & Preventive Actions (CAPA) - ISO 27001 10.1
Tracks actions taken to address nonconformities.
"""
__tablename__ = 'compliance_corrective_actions'
id = Column(String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
capa_id = Column(String(30), unique=True, nullable=False, index=True) # e.g., "CAPA-2026-001"
# Link to finding
finding_id = Column(String(36), ForeignKey('compliance_audit_findings.id'), nullable=False, index=True)
# Type
capa_type = Column(Enum(CAPATypeEnum), nullable=False)
# Action details
title = Column(String(300), nullable=False)
description = Column(Text, nullable=False)
expected_outcome = Column(Text)
# Responsibility
assigned_to = Column(String(100), nullable=False)
approved_by = Column(String(100))
# Timeline
planned_start = Column(Date)
planned_completion = Column(Date, nullable=False)
actual_completion = Column(Date)
# Status
status = Column(String(30), default="planned") # planned, in_progress, completed, verified, cancelled
progress_percentage = Column(Integer, default=0)
# Resources
estimated_effort_hours = Column(Integer)
actual_effort_hours = Column(Integer)
resources_required = Column(Text)
# Evidence of implementation
implementation_evidence = Column(Text)
evidence_ids = Column(JSON)
# Effectiveness review
effectiveness_criteria = Column(Text)
effectiveness_verified = Column(Boolean, default=False)
effectiveness_verification_date = Column(Date)
effectiveness_notes = Column(Text)
# Timestamps
created_at = Column(DateTime, default=lambda: datetime.now(timezone.utc))
updated_at = Column(DateTime, default=lambda: datetime.now(timezone.utc), onupdate=lambda: datetime.now(timezone.utc))
# Relationships
finding = relationship("AuditFindingDB", back_populates="corrective_actions")
__table_args__ = (
Index('ix_capa_status', 'status'),
Index('ix_capa_due', 'planned_completion'),
)
def __repr__(self):
return f"<CAPA {self.capa_id}: {self.capa_type.value}>"
class ManagementReviewDB(Base):
"""
Management Review (ISO 27001 Clause 9.3)
Records mandatory management reviews of the ISMS.
"""
__tablename__ = 'compliance_management_reviews'
id = Column(String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
review_id = Column(String(30), unique=True, nullable=False, index=True) # e.g., "MR-2026-Q1"
# Review details
title = Column(String(200), nullable=False)
review_date = Column(Date, nullable=False)
review_period_start = Column(Date) # Period being reviewed
review_period_end = Column(Date)
# Participants
chairperson = Column(String(100), nullable=False) # Usually top management
attendees = Column(JSON) # List of {"name": "", "role": ""}
# 9.3 Review Inputs (mandatory!)
input_previous_actions = Column(Text) # Status of previous review actions
input_isms_changes = Column(Text) # Changes in internal/external issues
input_security_performance = Column(Text) # Nonconformities, monitoring, audit results
input_interested_party_feedback = Column(Text)
input_risk_assessment_results = Column(Text)
input_improvement_opportunities = Column(Text)
# Additional inputs
input_policy_effectiveness = Column(Text)
input_objective_achievement = Column(Text)
input_resource_adequacy = Column(Text)
# 9.3 Review Outputs (mandatory!)
output_improvement_decisions = Column(Text) # Decisions for improvement
output_isms_changes = Column(Text) # Changes needed to ISMS
output_resource_needs = Column(Text) # Resource requirements
# Action items
action_items = Column(JSON) # List of {"action": "", "owner": "", "due_date": ""}
# Overall assessment
isms_effectiveness_rating = Column(String(20)) # "effective", "partially_effective", "not_effective"
key_decisions = Column(Text)
# Approval
status = Column(String(30), default="draft") # draft, conducted, approved
approved_by = Column(String(100))
approved_at = Column(DateTime)
minutes_document_path = Column(String(500)) # Link to meeting minutes
# Next review
next_review_date = Column(Date)
# Timestamps
created_at = Column(DateTime, default=lambda: datetime.now(timezone.utc))
updated_at = Column(DateTime, default=lambda: datetime.now(timezone.utc), onupdate=lambda: datetime.now(timezone.utc))
__table_args__ = (
Index('ix_mgmt_review_date', 'review_date'),
Index('ix_mgmt_review_status', 'status'),
)
def __repr__(self):
return f"<ManagementReview {self.review_id}: {self.review_date}>"
class InternalAuditDB(Base):
"""
Internal Audit (ISO 27001 Clause 9.2)
Tracks internal audit program and individual audits.
"""
__tablename__ = 'compliance_internal_audits'
id = Column(String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
audit_id = Column(String(30), unique=True, nullable=False, index=True) # e.g., "IA-2026-001"
# Audit details
title = Column(String(200), nullable=False)
audit_type = Column(String(50), nullable=False) # "scheduled", "surveillance", "special"
# Scope
scope_description = Column(Text, nullable=False)
iso_chapters_covered = Column(JSON) # e.g., ["4", "5", "6.1"]
annex_a_controls_covered = Column(JSON) # e.g., ["A.5", "A.6"]
processes_covered = Column(JSON)
departments_covered = Column(JSON)
# Audit criteria
criteria = Column(Text) # Standards, policies being audited against
# Timeline
planned_date = Column(Date, nullable=False)
actual_start_date = Column(Date)
actual_end_date = Column(Date)
# Audit team
lead_auditor = Column(String(100), nullable=False)
audit_team = Column(JSON) # List of auditor names
auditee_representatives = Column(JSON) # Who was interviewed
# Status
status = Column(String(30), default="planned") # planned, in_progress, completed, cancelled
# Results summary
total_findings = Column(Integer, default=0)
major_findings = Column(Integer, default=0)
minor_findings = Column(Integer, default=0)
ofi_count = Column(Integer, default=0)
positive_observations = Column(Integer, default=0)
# Conclusion
audit_conclusion = Column(Text)
overall_assessment = Column(String(30)) # "conforming", "minor_nc", "major_nc"
# Report
report_date = Column(Date)
report_document_path = Column(String(500))
# Sign-off
report_approved_by = Column(String(100))
report_approved_at = Column(DateTime)
# Follow-up
follow_up_audit_required = Column(Boolean, default=False)
follow_up_audit_id = Column(String(36))
# Timestamps
created_at = Column(DateTime, default=lambda: datetime.now(timezone.utc))
updated_at = Column(DateTime, default=lambda: datetime.now(timezone.utc), onupdate=lambda: datetime.now(timezone.utc))
# Relationships
findings = relationship("AuditFindingDB", backref="internal_audit", foreign_keys=[AuditFindingDB.internal_audit_id])
__table_args__ = (
Index('ix_internal_audit_date', 'planned_date'),
Index('ix_internal_audit_status', 'status'),
)
def __repr__(self):
return f"<InternalAudit {self.audit_id}: {self.title}>"
class AuditTrailDB(Base):
"""
Comprehensive Audit Trail for ISMS Changes
Tracks all changes to compliance-relevant data for
accountability and forensic analysis.
"""
__tablename__ = 'compliance_audit_trail'
id = Column(String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
# What changed
entity_type = Column(String(50), nullable=False, index=True) # "control", "risk", "policy", etc.
entity_id = Column(String(36), nullable=False, index=True)
entity_name = Column(String(200)) # Human-readable identifier
# Action
action = Column(String(20), nullable=False) # "create", "update", "delete", "approve", "sign"
# Change details
field_changed = Column(String(100)) # Which field (for updates)
old_value = Column(Text)
new_value = Column(Text)
change_summary = Column(Text) # Human-readable summary
# Who & When
performed_by = Column(String(100), nullable=False)
performed_at = Column(DateTime, nullable=False, default=lambda: datetime.now(timezone.utc))
# Context
ip_address = Column(String(45))
user_agent = Column(String(500))
session_id = Column(String(100))
# Integrity
checksum = Column(String(64)) # SHA-256 of the change
# Timestamps (immutable after creation)
created_at = Column(DateTime, nullable=False, default=lambda: datetime.now(timezone.utc))
__table_args__ = (
Index('ix_audit_trail_entity', 'entity_type', 'entity_id'),
Index('ix_audit_trail_time', 'performed_at'),
Index('ix_audit_trail_user', 'performed_by'),
)
def __repr__(self):
return f"<AuditTrail {self.action} on {self.entity_type}/{self.entity_id}>"
class ISMSReadinessCheckDB(Base):
"""
ISMS Readiness Check Results
Stores automated pre-audit checks to identify potential
Major findings before external audit.
"""
__tablename__ = 'compliance_isms_readiness'
id = Column(String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
# Check run
check_date = Column(DateTime, nullable=False, default=lambda: datetime.now(timezone.utc))
triggered_by = Column(String(100)) # "scheduled", "manual", "pre-audit"
# Overall status
overall_status = Column(String(20), nullable=False) # "ready", "at_risk", "not_ready"
certification_possible = Column(Boolean, nullable=False)
# Chapter-by-chapter status (ISO 27001)
chapter_4_status = Column(String(20)) # Context
chapter_5_status = Column(String(20)) # Leadership
chapter_6_status = Column(String(20)) # Planning
chapter_7_status = Column(String(20)) # Support
chapter_8_status = Column(String(20)) # Operation
chapter_9_status = Column(String(20)) # Performance
chapter_10_status = Column(String(20)) # Improvement
# Potential Major findings
potential_majors = Column(JSON) # List of {"check": "", "status": "", "recommendation": ""}
# Potential Minor findings
potential_minors = Column(JSON)
# Improvement opportunities
improvement_opportunities = Column(JSON)
# Scores
readiness_score = Column(Float) # 0-100
documentation_score = Column(Float)
implementation_score = Column(Float)
evidence_score = Column(Float)
# Recommendations
priority_actions = Column(JSON) # List of recommended actions before audit
# Timestamps
created_at = Column(DateTime, default=lambda: datetime.now(timezone.utc))
__table_args__ = (
Index('ix_readiness_date', 'check_date'),
Index('ix_readiness_status', 'overall_status'),
)
def __repr__(self):
return f"<ISMSReadiness {self.check_date}: {self.overall_status}>"
__all__ = [
"FindingTypeEnum",
"FindingStatusEnum",
"CAPATypeEnum",
"AuditFindingDB",
"CorrectiveActionDB",
"ManagementReviewDB",
"InternalAuditDB",
"AuditTrailDB",
"ISMSReadinessCheckDB",
]
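`InternalAuditDB` denormalizes finding counts into summary columns, and `AuditFindingDB.is_blocking` encodes the rule that an open Major finding blocks certification. A minimal sketch of both behaviors, detached from SQLAlchemy (the `summarize_findings` helper is an assumption; the source does not specify where the counts are computed):

```python
from collections import Counter

def summarize_findings(finding_types: list[str]) -> dict[str, int]:
    # Derive the InternalAuditDB summary columns from FindingTypeEnum values.
    counts = Counter(finding_types)
    return {
        "total_findings": len(finding_types),
        "major_findings": counts["major"],
        "minor_findings": counts["minor"],
        "ofi_count": counts["ofi"],
        "positive_observations": counts["positive"],
    }

def is_blocking(finding_type: str, status: str) -> bool:
    # Mirrors AuditFindingDB.is_blocking: an unclosed Major finding blocks certification.
    return finding_type == "major" and status != "closed"
```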


@@ -0,0 +1,323 @@
"""
ISMS Governance models (ISO 27001 Clauses 4-6) — extracted from compliance/db/models.py.
Covers the documentation and planning side of the ISMS: scope, context,
policies, security objectives, and the Statement of Applicability. The audit
execution side (findings, CAPA, management reviews, internal audits, audit
trail, readiness checks) lives in ``isms_audit_models.py``.
Re-exported from ``compliance.db.models`` for backwards compatibility.
DO NOT change __tablename__, column names, or relationship strings — the
database schema is frozen.
"""
import uuid
import enum
from datetime import datetime, date, timezone
from sqlalchemy import (
Column, String, Text, Integer, Boolean, DateTime, Date,
ForeignKey, Enum, JSON, Index,
)
from classroom_engine.database import Base
# ============================================================================
# SHARED GOVERNANCE ENUMS
# ============================================================================
class ApprovalStatusEnum(str, enum.Enum):
"""Approval status for ISMS documents."""
DRAFT = "draft"
UNDER_REVIEW = "under_review"
APPROVED = "approved"
SUPERSEDED = "superseded"
# ============================================================================
# MODELS
# ============================================================================
class ISMSScopeDB(Base):
"""
ISMS Scope Definition (ISO 27001 Clause 4.3)
Defines the boundaries and applicability of the ISMS.
This is MANDATORY for certification.
"""
__tablename__ = 'compliance_isms_scope'
id = Column(String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
version = Column(String(20), nullable=False, default="1.0")
# Scope definition
scope_statement = Column(Text, nullable=False) # Main scope text
included_locations = Column(JSON) # List of locations
included_processes = Column(JSON) # List of processes
included_services = Column(JSON) # List of services/products
excluded_items = Column(JSON) # Explicitly excluded items
exclusion_justification = Column(Text) # Why items are excluded
# Boundaries
organizational_boundary = Column(Text) # Legal entity, departments
physical_boundary = Column(Text) # Locations, networks
technical_boundary = Column(Text) # Systems, applications
# Approval
status = Column(Enum(ApprovalStatusEnum), default=ApprovalStatusEnum.DRAFT)
approved_by = Column(String(100))
approved_at = Column(DateTime)
approval_signature = Column(String(64)) # SHA-256 hash
# Validity
effective_date = Column(Date)
review_date = Column(Date) # Next mandatory review
# Timestamps
created_at = Column(DateTime, default=lambda: datetime.now(timezone.utc))
updated_at = Column(DateTime, default=lambda: datetime.now(timezone.utc), onupdate=lambda: datetime.now(timezone.utc))
created_by = Column(String(100))
updated_by = Column(String(100))
__table_args__ = (
Index('ix_isms_scope_status', 'status'),
)
def __repr__(self):
return f"<ISMSScope v{self.version} ({self.status.value})>"
class ISMSContextDB(Base):
"""
ISMS Context (ISO 27001 Clauses 4.1 and 4.2)
Documents internal/external issues and interested parties.
"""
__tablename__ = 'compliance_isms_context'
id = Column(String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
version = Column(String(20), nullable=False, default="1.0")
# 4.1 Internal issues
internal_issues = Column(JSON) # List of {"issue": "", "impact": "", "treatment": ""}
# 4.1 External issues
external_issues = Column(JSON) # List of {"issue": "", "impact": "", "treatment": ""}
# 4.2 Interested parties
interested_parties = Column(JSON) # List of {"party": "", "requirements": [], "relevance": ""}
# Legal/regulatory requirements
regulatory_requirements = Column(JSON) # GDPR (DSGVO), AI Act, etc.
contractual_requirements = Column(JSON) # Customer contracts
# Analysis
swot_strengths = Column(JSON)
swot_weaknesses = Column(JSON)
swot_opportunities = Column(JSON)
swot_threats = Column(JSON)
# Approval
status = Column(Enum(ApprovalStatusEnum), default=ApprovalStatusEnum.DRAFT)
approved_by = Column(String(100))
approved_at = Column(DateTime)
# Review
last_reviewed_at = Column(DateTime)
next_review_date = Column(Date)
# Timestamps
created_at = Column(DateTime, default=lambda: datetime.now(timezone.utc))
updated_at = Column(DateTime, default=lambda: datetime.now(timezone.utc), onupdate=lambda: datetime.now(timezone.utc))
def __repr__(self):
return f"<ISMSContext v{self.version}>"
class ISMSPolicyDB(Base):
"""
ISMS Policies (ISO 27001 Clause 5.2)
Information security policy and sub-policies.
"""
__tablename__ = 'compliance_isms_policies'
id = Column(String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
policy_id = Column(String(30), unique=True, nullable=False, index=True) # e.g., "POL-ISMS-001"
# Policy details
title = Column(String(200), nullable=False)
policy_type = Column(String(50), nullable=False) # "master", "operational", "technical"
description = Column(Text)
policy_text = Column(Text, nullable=False) # Full policy content
# Scope
applies_to = Column(JSON) # Roles, departments, systems
# Document control
version = Column(String(20), nullable=False, default="1.0")
status = Column(Enum(ApprovalStatusEnum), default=ApprovalStatusEnum.DRAFT)
# Approval chain
authored_by = Column(String(100))
reviewed_by = Column(String(100))
approved_by = Column(String(100)) # Must be top management
approved_at = Column(DateTime)
approval_signature = Column(String(64))
# Validity
effective_date = Column(Date)
review_frequency_months = Column(Integer, default=12)
next_review_date = Column(Date)
# References
parent_policy_id = Column(String(36), ForeignKey('compliance_isms_policies.id'))
related_controls = Column(JSON) # List of control_ids
# Document path
document_path = Column(String(500)) # Link to full document
# Timestamps
created_at = Column(DateTime, default=lambda: datetime.now(timezone.utc))
updated_at = Column(DateTime, default=lambda: datetime.now(timezone.utc), onupdate=lambda: datetime.now(timezone.utc))
__table_args__ = (
Index('ix_policy_type_status', 'policy_type', 'status'),
)
def __repr__(self):
return f"<ISMSPolicy {self.policy_id}: {self.title}>"
class SecurityObjectiveDB(Base):
"""
Security Objectives (ISO 27001 Clause 6.2)
Measurable information security objectives.
"""
__tablename__ = 'compliance_security_objectives'
id = Column(String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
objective_id = Column(String(30), unique=True, nullable=False, index=True) # e.g., "OBJ-001"
# Objective definition
title = Column(String(200), nullable=False)
description = Column(Text)
category = Column(String(50)) # "availability", "confidentiality", "integrity", "compliance"
# SMART criteria
specific = Column(Text) # What exactly
measurable = Column(Text) # How measured
achievable = Column(Text) # Is it realistic
relevant = Column(Text) # Why important
time_bound = Column(Text) # Deadline
# Metrics
kpi_name = Column(String(100))
kpi_target = Column(String(100)) # Target value
kpi_current = Column(String(100)) # Current value
kpi_unit = Column(String(50)) # %, count, score
measurement_frequency = Column(String(50)) # monthly, quarterly
# Responsibility
owner = Column(String(100))
accountable = Column(String(100)) # RACI: Accountable
# Status
status = Column(String(30), default="active") # active, achieved, not_achieved, cancelled
progress_percentage = Column(Integer, default=0)
# Timeline
target_date = Column(Date)
achieved_date = Column(Date)
# Linked items
related_controls = Column(JSON)
related_risks = Column(JSON)
# Approval
approved_by = Column(String(100))
approved_at = Column(DateTime)
# Timestamps
created_at = Column(DateTime, default=lambda: datetime.now(timezone.utc))
updated_at = Column(DateTime, default=lambda: datetime.now(timezone.utc), onupdate=lambda: datetime.now(timezone.utc))
__table_args__ = (
Index('ix_objective_status', 'status'),
Index('ix_objective_category', 'category'),
)
def __repr__(self):
return f"<SecurityObjective {self.objective_id}: {self.title}>"
class StatementOfApplicabilityDB(Base):
"""
Statement of Applicability (SoA) - ISO 27001 Annex A mapping
Documents which Annex A controls are applicable and why.
This is MANDATORY for certification.
"""
__tablename__ = 'compliance_soa'
id = Column(String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
# ISO 27001:2022 Annex A reference
annex_a_control = Column(String(20), nullable=False, index=True) # e.g., "A.5.1"
annex_a_title = Column(String(300), nullable=False)
annex_a_category = Column(String(100)) # "Organizational", "People", "Physical", "Technological"
# Applicability decision
is_applicable = Column(Boolean, nullable=False)
applicability_justification = Column(Text, nullable=False) # MUST be documented
# Implementation status
implementation_status = Column(String(30), default="planned") # planned, partial, implemented, not_implemented
implementation_notes = Column(Text)
# Mapping to our controls
breakpilot_control_ids = Column(JSON) # List of our control_ids that address this
coverage_level = Column(String(20), default="full") # full, partial, planned
# Evidence
evidence_description = Column(Text)
evidence_ids = Column(JSON) # Links to EvidenceDB
# Risk-based justification (for exclusions)
risk_assessment_notes = Column(Text) # If not applicable, explain why
compensating_controls = Column(Text) # If partial, explain compensating measures
# Approval
reviewed_by = Column(String(100))
reviewed_at = Column(DateTime)
approved_by = Column(String(100))
approved_at = Column(DateTime)
# Version tracking
version = Column(String(20), default="1.0")
# Timestamps
created_at = Column(DateTime, default=lambda: datetime.now(timezone.utc))
updated_at = Column(DateTime, default=lambda: datetime.now(timezone.utc), onupdate=lambda: datetime.now(timezone.utc))
__table_args__ = (
Index('ix_soa_annex_control', 'annex_a_control', unique=True),
Index('ix_soa_applicable', 'is_applicable'),
Index('ix_soa_status', 'implementation_status'),
)
def __repr__(self):
return f"<SoA {self.annex_a_control}: {'Applicable' if self.is_applicable else 'N/A'}>"
__all__ = [
"ApprovalStatusEnum",
"ISMSScopeDB",
"ISMSContextDB",
"ISMSPolicyDB",
"SecurityObjectiveDB",
"StatementOfApplicabilityDB",
]

View File

@@ -10,7 +10,7 @@ Provides CRUD operations for ISO 27001 certification-related entities:
 """
 import uuid
-from datetime import datetime, date
+from datetime import datetime, date, timezone
 from typing import List, Optional, Dict, Any, Tuple
 from sqlalchemy.orm import Session as DBSession
@@ -94,11 +94,11 @@ class ISMSScopeRepository:
         import hashlib
         scope.status = ApprovalStatusEnum.APPROVED
         scope.approved_by = approved_by
-        scope.approved_at = datetime.utcnow()
+        scope.approved_at = datetime.now(timezone.utc)
         scope.effective_date = effective_date
         scope.review_date = review_date
         scope.approval_signature = hashlib.sha256(
-            f"{scope.scope_statement}|{approved_by}|{datetime.utcnow().isoformat()}".encode()
+            f"{scope.scope_statement}|{approved_by}|{datetime.now(timezone.utc).isoformat()}".encode()
         ).hexdigest()
         self.db.commit()
@@ -185,7 +185,7 @@ class ISMSPolicyRepository:
         policy.status = ApprovalStatusEnum.APPROVED
         policy.reviewed_by = reviewed_by
         policy.approved_by = approved_by
-        policy.approved_at = datetime.utcnow()
+        policy.approved_at = datetime.now(timezone.utc)
         policy.effective_date = effective_date
         policy.next_review_date = date(
             effective_date.year + (policy.review_frequency_months // 12),
@@ -193,7 +193,7 @@ class ISMSPolicyRepository:
             effective_date.day
         )
         policy.approval_signature = hashlib.sha256(
-            f"{policy.policy_id}|{approved_by}|{datetime.utcnow().isoformat()}".encode()
+            f"{policy.policy_id}|{approved_by}|{datetime.now(timezone.utc).isoformat()}".encode()
         ).hexdigest()
         self.db.commit()
@@ -472,7 +472,7 @@ class AuditFindingRepository:
         finding.verification_method = verification_method
         finding.verification_evidence = verification_evidence
         finding.verified_by = closed_by
-        finding.verified_at = datetime.utcnow()
+        finding.verified_at = datetime.now(timezone.utc)
         self.db.commit()
         self.db.refresh(finding)
@@ -644,7 +644,7 @@ class ManagementReviewRepository:
         review.status = "approved"
         review.approved_by = approved_by
-        review.approved_at = datetime.utcnow()
+        review.approved_at = datetime.now(timezone.utc)
         review.next_review_date = next_review_date
         review.minutes_document_path = minutes_document_path
@@ -761,7 +761,7 @@ class AuditTrailRepository:
             new_value=new_value,
             change_summary=change_summary,
             performed_by=performed_by,
-            performed_at=datetime.utcnow(),
+            performed_at=datetime.now(timezone.utc),
             checksum=hashlib.sha256(
                 f"{entity_type}|{entity_id}|{action}|{performed_by}".encode()
             ).hexdigest(),
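The hunks above are representative of the whole `utcnow()` sweep: `datetime.utcnow()` returns a naive datetime, `datetime.now(timezone.utc)` an aware one. A standalone sketch of the difference and the latent bug it removes:

```python
from datetime import datetime, timezone

naive = datetime.utcnow()           # deprecated since Python 3.12; tzinfo is None
aware = datetime.now(timezone.utc)  # same wall-clock instant, tzinfo attached

assert naive.tzinfo is None
assert aware.tzinfo is timezone.utc

# The latent bug: naive and aware datetimes cannot be compared, so naive
# Python-side values next to timezone=True columns invite TypeErrors.
try:
    _ = naive < aware
    mixed_comparison_fails = False
except TypeError:
    mixed_comparison_fails = True

assert mixed_comparison_fails
```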

File diff suppressed because it is too large.

View File

@@ -0,0 +1,134 @@
"""
Regulation & Requirement models — extracted from compliance/db/models.py.
The foundational compliance aggregate: regulations (GDPR, AI Act, CRA, ...) and
the individual requirements they contain. Re-exported from
``compliance.db.models`` for backwards compatibility.
DO NOT change __tablename__, column names, or relationship strings.
"""
import uuid
import enum
from datetime import datetime, timezone
from sqlalchemy import (
Column, String, Text, Integer, Boolean, DateTime, Date,
ForeignKey, Enum, JSON, Index,
)
from sqlalchemy.orm import relationship
from classroom_engine.database import Base
# ============================================================================
# ENUMS
# ============================================================================
class RegulationTypeEnum(str, enum.Enum):
"""Type of regulation/standard."""
EU_REGULATION = "eu_regulation" # Directly applicable EU law
EU_DIRECTIVE = "eu_directive" # Requires national implementation
DE_LAW = "de_law" # German national law
BSI_STANDARD = "bsi_standard" # BSI technical guidelines
INDUSTRY_STANDARD = "industry_standard" # ISO, OWASP, etc.
# ============================================================================
# MODELS
# ============================================================================
class RegulationDB(Base):
"""
Represents a regulation, directive, or standard.
Examples: GDPR, AI Act, CRA, BSI-TR-03161
"""
__tablename__ = 'compliance_regulations'
id = Column(String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
code = Column(String(20), unique=True, nullable=False, index=True) # e.g., "GDPR", "AIACT"
name = Column(String(200), nullable=False) # Short name
full_name = Column(Text) # Full official name
regulation_type = Column(Enum(RegulationTypeEnum), nullable=False)
source_url = Column(String(500)) # EUR-Lex URL or similar
local_pdf_path = Column(String(500)) # Local PDF if available
effective_date = Column(Date) # When it came into force
description = Column(Text) # Brief description
is_active = Column(Boolean, default=True)
# Timestamps
created_at = Column(DateTime, default=lambda: datetime.now(timezone.utc))
updated_at = Column(DateTime, default=lambda: datetime.now(timezone.utc), onupdate=lambda: datetime.now(timezone.utc))
# Relationships
requirements = relationship("RequirementDB", back_populates="regulation", cascade="all, delete-orphan")
def __repr__(self):
return f"<Regulation {self.code}: {self.name}>"
class RequirementDB(Base):
"""
Individual requirement from a regulation.
Examples: GDPR Art. 32(1)(a), AI Act Art. 9, BSI-TR O.Auth_1
"""
__tablename__ = 'compliance_requirements'
id = Column(String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
regulation_id = Column(String(36), ForeignKey('compliance_regulations.id'), nullable=False, index=True)
# Requirement identification
article = Column(String(50), nullable=False) # e.g., "Art. 32", "O.Auth_1"
paragraph = Column(String(20)) # e.g., "(1)(a)"
requirement_id_external = Column(String(50)) # External ID (e.g., BSI ID)
title = Column(String(300), nullable=False) # Requirement title
description = Column(Text) # Brief description
requirement_text = Column(Text) # Original text from regulation
# Breakpilot-specific interpretation and implementation
breakpilot_interpretation = Column(Text) # How Breakpilot interprets this
implementation_status = Column(String(30), default="not_started") # not_started, in_progress, implemented, verified
implementation_details = Column(Text) # How we implemented it
code_references = Column(JSON) # List of {"file": "...", "line": ..., "description": "..."}
documentation_links = Column(JSON) # List of internal doc links
# Evidence for auditors
evidence_description = Column(Text) # What evidence proves compliance
evidence_artifacts = Column(JSON) # List of {"type": "...", "path": "...", "description": "..."}
# Audit-specific fields
auditor_notes = Column(Text) # Notes from auditor review
audit_status = Column(String(30), default="pending") # pending, in_review, approved, rejected
last_audit_date = Column(DateTime)
last_auditor = Column(String(100))
is_applicable = Column(Boolean, default=True) # Applicable to Breakpilot?
applicability_reason = Column(Text) # Why/why not applicable
priority = Column(Integer, default=2) # 1=Critical, 2=High, 3=Medium
# Source document reference
source_page = Column(Integer) # Page number in source document
source_section = Column(String(100)) # Section in source document
# Timestamps
created_at = Column(DateTime, default=lambda: datetime.now(timezone.utc))
updated_at = Column(DateTime, default=lambda: datetime.now(timezone.utc), onupdate=lambda: datetime.now(timezone.utc))
# Relationships
regulation = relationship("RegulationDB", back_populates="requirements")
control_mappings = relationship("ControlMappingDB", back_populates="requirement", cascade="all, delete-orphan")
__table_args__ = (
Index('ix_requirement_regulation_article', 'regulation_id', 'article'),
Index('ix_requirement_audit_status', 'audit_status'),
Index('ix_requirement_impl_status', 'implementation_status'),
)
def __repr__(self):
return f"<Requirement {self.article} {self.paragraph or ''}>"
__all__ = ["RegulationTypeEnum", "RegulationDB", "RequirementDB"]

View File

@@ -6,7 +6,7 @@ Provides CRUD operations and business logic queries for all compliance entities.
 from __future__ import annotations
 import uuid
-from datetime import datetime, date
+from datetime import datetime, date, timezone
 from typing import List, Optional, Dict, Any
 from sqlalchemy.orm import Session as DBSession, selectinload, joinedload
@@ -86,7 +86,7 @@ class RegulationRepository:
         for key, value in kwargs.items():
             if hasattr(regulation, key):
                 setattr(regulation, key, value)
-        regulation.updated_at = datetime.utcnow()
+        regulation.updated_at = datetime.now(timezone.utc)
         self.db.commit()
         self.db.refresh(regulation)
         return regulation
@@ -425,7 +425,7 @@ class ControlRepository:
         control.status = status
         if status_notes:
             control.status_notes = status_notes
-        control.updated_at = datetime.utcnow()
+        control.updated_at = datetime.now(timezone.utc)
         self.db.commit()
         self.db.refresh(control)
         return control
@@ -435,10 +435,10 @@ class ControlRepository:
         control = self.get_by_control_id(control_id)
         if not control:
             return None
-        control.last_reviewed_at = datetime.utcnow()
+        control.last_reviewed_at = datetime.now(timezone.utc)
         from datetime import timedelta
-        control.next_review_at = datetime.utcnow() + timedelta(days=control.review_frequency_days)
-        control.updated_at = datetime.utcnow()
+        control.next_review_at = datetime.now(timezone.utc) + timedelta(days=control.review_frequency_days)
+        control.updated_at = datetime.now(timezone.utc)
         self.db.commit()
         self.db.refresh(control)
         return control
@@ -450,7 +450,7 @@ class ControlRepository:
             .filter(
                 or_(
                     ControlDB.next_review_at is None,
-                    ControlDB.next_review_at <= datetime.utcnow()
+                    ControlDB.next_review_at <= datetime.now(timezone.utc)
                 )
             )
             .order_by(ControlDB.next_review_at)
@@ -624,7 +624,7 @@ class EvidenceRepository:
         if not evidence:
             return None
         evidence.status = status
-        evidence.updated_at = datetime.utcnow()
+        evidence.updated_at = datetime.now(timezone.utc)
         self.db.commit()
         self.db.refresh(evidence)
         return evidence
@@ -749,7 +749,7 @@ class RiskRepository:
             risk.residual_likelihood, risk.residual_impact
         )
-        risk.updated_at = datetime.utcnow()
+        risk.updated_at = datetime.now(timezone.utc)
         self.db.commit()
         self.db.refresh(risk)
         return risk
@@ -860,9 +860,9 @@ class AuditExportRepository:
         export.compliance_score = compliance_score
         if status == ExportStatusEnum.COMPLETED:
-            export.completed_at = datetime.utcnow()
-        export.updated_at = datetime.utcnow()
+            export.completed_at = datetime.now(timezone.utc)
+        export.updated_at = datetime.now(timezone.utc)
         self.db.commit()
         self.db.refresh(export)
         return export
@@ -1156,11 +1156,11 @@ class AuditSessionRepository:
         session.status = status
         if status == AuditSessionStatusEnum.IN_PROGRESS and not session.started_at:
-            session.started_at = datetime.utcnow()
+            session.started_at = datetime.now(timezone.utc)
         elif status == AuditSessionStatusEnum.COMPLETED:
-            session.completed_at = datetime.utcnow()
-        session.updated_at = datetime.utcnow()
+            session.completed_at = datetime.now(timezone.utc)
+        session.updated_at = datetime.now(timezone.utc)
         self.db.commit()
         self.db.refresh(session)
         return session
@@ -1183,7 +1183,7 @@ class AuditSessionRepository:
         if completed_items is not None:
             session.completed_items = completed_items
-        session.updated_at = datetime.utcnow()
+        session.updated_at = datetime.now(timezone.utc)
         self.db.commit()
         self.db.refresh(session)
         return session
@@ -1207,9 +1207,9 @@ class AuditSessionRepository:
         total_requirements = query.scalar() or 0
         session.status = AuditSessionStatusEnum.IN_PROGRESS
-        session.started_at = datetime.utcnow()
+        session.started_at = datetime.now(timezone.utc)
         session.total_items = total_requirements
-        session.updated_at = datetime.utcnow()
+        session.updated_at = datetime.now(timezone.utc)
         self.db.commit()
         self.db.refresh(session)
@@ -1344,7 +1344,7 @@ class AuditSignOffRepository:
         if sign and signed_by:
             signoff.create_signature(signed_by)
-        signoff.updated_at = datetime.utcnow()
+        signoff.updated_at = datetime.now(timezone.utc)
         self.db.commit()
         self.db.refresh(signoff)
@@ -1376,7 +1376,7 @@ class AuditSignOffRepository:
             signoff.notes = notes
             if sign and signed_by:
                 signoff.create_signature(signed_by)
-            signoff.updated_at = datetime.utcnow()
+            signoff.updated_at = datetime.now(timezone.utc)
         else:
             # Create new
             signoff = AuditSignOffDB(
@@ -1416,7 +1416,7 @@ class AuditSignOffRepository:
         ).first()
         if session:
             session.completed_items = completed
-            session.updated_at = datetime.utcnow()
+            session.updated_at = datetime.now(timezone.utc)
         self.db.commit()
     def get_checklist(
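The `@@ -435` hunk above reschedules a control review as now plus `review_frequency_days`. The same logic, as a standalone sketch:

```python
from datetime import datetime, timedelta, timezone

def schedule_next_review(review_frequency_days, now=None):
    """Simplified mirror of the mark-reviewed logic in the hunk above."""
    now = now or datetime.now(timezone.utc)
    return now, now + timedelta(days=review_frequency_days)

reviewed_at, next_review = schedule_next_review(90)
assert (next_review - reviewed_at) == timedelta(days=90)
assert next_review.tzinfo is timezone.utc
```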

View File

@@ -0,0 +1,176 @@
"""
Service Module Registry models — extracted from compliance/db/models.py.
Sprint 3: registry of all Breakpilot services/modules for compliance mapping,
per-module regulation applicability, and per-module risk aggregation.
Re-exported from ``compliance.db.models`` for backwards compatibility.
DO NOT change __tablename__, column names, or relationship strings.
"""
import uuid
import enum
from datetime import datetime, timezone
from sqlalchemy import (
Column, String, Text, Integer, Boolean, DateTime,
ForeignKey, Enum, JSON, Index, Float,
)
from sqlalchemy.orm import relationship
from classroom_engine.database import Base
# RiskLevelEnum is re-used across aggregates; sourced here from control_models.
from compliance.db.control_models import RiskLevelEnum # noqa: F401
# ============================================================================
# ENUMS
# ============================================================================
class ServiceTypeEnum(str, enum.Enum):
"""Type of Breakpilot service/module."""
BACKEND = "backend" # API/Backend services
DATABASE = "database" # Data storage
AI = "ai" # AI/ML services
COMMUNICATION = "communication" # Chat/Video/Messaging
STORAGE = "storage" # File/Object storage
INFRASTRUCTURE = "infrastructure" # Load balancer, reverse proxy
MONITORING = "monitoring" # Logging, metrics
SECURITY = "security" # Auth, encryption, secrets
class RelevanceLevelEnum(str, enum.Enum):
"""Relevance level of a regulation to a service."""
CRITICAL = "critical" # Non-compliance = shutdown
HIGH = "high" # Major risk
MEDIUM = "medium" # Moderate risk
LOW = "low" # Minor risk
# ============================================================================
# MODELS
# ============================================================================
class ServiceModuleDB(Base):
"""
Registry of all Breakpilot services/modules for compliance mapping.
Tracks which regulations apply to which services, enabling:
- Service-specific compliance views
- Aggregated risk per service
- Gap analysis by module
"""
__tablename__ = 'compliance_service_modules'
id = Column(String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
name = Column(String(100), unique=True, nullable=False, index=True) # e.g., "consent-service"
display_name = Column(String(200), nullable=False) # e.g., "Go Consent Service"
description = Column(Text)
# Technical details
service_type = Column(Enum(ServiceTypeEnum), nullable=False)
port = Column(Integer) # Primary port (if applicable)
technology_stack = Column(JSON) # e.g., ["Go", "Gin", "PostgreSQL"]
repository_path = Column(String(500)) # e.g., "/consent-service"
docker_image = Column(String(200)) # e.g., "breakpilot-pwa-consent-service"
# Data categories handled
data_categories = Column(JSON) # e.g., ["personal_data", "consent_records"]
processes_pii = Column(Boolean, default=False) # Handles personally identifiable info?
processes_health_data = Column(Boolean, default=False) # Handles special category health data?
ai_components = Column(Boolean, default=False) # Contains AI/ML components?
# Status
is_active = Column(Boolean, default=True)
criticality = Column(String(20), default="medium") # "critical", "high", "medium", "low"
# Compliance aggregation
compliance_score = Column(Float) # Calculated score 0-100
last_compliance_check = Column(DateTime)
# Owner
owner_team = Column(String(100)) # e.g., "Backend Team"
owner_contact = Column(String(200)) # e.g., "backend@breakpilot.app"
# Timestamps
created_at = Column(DateTime, default=lambda: datetime.now(timezone.utc))
updated_at = Column(DateTime, default=lambda: datetime.now(timezone.utc), onupdate=lambda: datetime.now(timezone.utc))
# Relationships
regulation_mappings = relationship("ModuleRegulationMappingDB", back_populates="module", cascade="all, delete-orphan")
module_risks = relationship("ModuleRiskDB", back_populates="module", cascade="all, delete-orphan")
__table_args__ = (
Index('ix_module_type_active', 'service_type', 'is_active'),
)
def __repr__(self):
return f"<ServiceModule {self.name}: {self.display_name}>"
class ModuleRegulationMappingDB(Base):
"""
Maps services to applicable regulations with relevance level.
Enables filtering: "Show all GDPR requirements for consent-service"
"""
__tablename__ = 'compliance_module_regulations'
id = Column(String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
module_id = Column(String(36), ForeignKey('compliance_service_modules.id'), nullable=False, index=True)
regulation_id = Column(String(36), ForeignKey('compliance_regulations.id'), nullable=False, index=True)
relevance_level = Column(Enum(RelevanceLevelEnum), nullable=False, default=RelevanceLevelEnum.MEDIUM)
notes = Column(Text) # Why this regulation applies
applicable_articles = Column(JSON) # List of specific articles that apply
# Timestamps
created_at = Column(DateTime, default=lambda: datetime.now(timezone.utc))
updated_at = Column(DateTime, default=lambda: datetime.now(timezone.utc), onupdate=lambda: datetime.now(timezone.utc))
# Relationships
module = relationship("ServiceModuleDB", back_populates="regulation_mappings")
regulation = relationship("RegulationDB")
__table_args__ = (
Index('ix_module_regulation', 'module_id', 'regulation_id', unique=True),
)
class ModuleRiskDB(Base):
"""
Service-specific risks aggregated from requirements and controls.
"""
__tablename__ = 'compliance_module_risks'
id = Column(String(36), primary_key=True, default=lambda: str(uuid.uuid4()))
module_id = Column(String(36), ForeignKey('compliance_service_modules.id'), nullable=False, index=True)
risk_id = Column(String(36), ForeignKey('compliance_risks.id'), nullable=False, index=True)
# Module-specific assessment
module_likelihood = Column(Integer) # 1-5, may differ from global
module_impact = Column(Integer) # 1-5, may differ from global
module_risk_level = Column(Enum(RiskLevelEnum))
assessment_notes = Column(Text) # Module-specific notes
# Timestamps
created_at = Column(DateTime, default=lambda: datetime.now(timezone.utc))
updated_at = Column(DateTime, default=lambda: datetime.now(timezone.utc), onupdate=lambda: datetime.now(timezone.utc))
# Relationships
module = relationship("ServiceModuleDB", back_populates="module_risks")
risk = relationship("RiskDB")
__table_args__ = (
Index('ix_module_risk', 'module_id', 'risk_id', unique=True),
)
__all__ = [
"ServiceTypeEnum",
"RelevanceLevelEnum",
"ServiceModuleDB",
"ModuleRegulationMappingDB",
"ModuleRiskDB",
]
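`ModuleRiskDB` stores a 1-5 `module_likelihood` and `module_impact` plus a `RiskLevelEnum`. A hedged sketch of a likelihood-times-impact scoring; the thresholds below are illustrative assumptions, not taken from this commit:

```python
def module_risk_level(likelihood, impact):
    """Map a 1-5 x 1-5 assessment to a level. Thresholds are illustrative only."""
    score = likelihood * impact
    if score >= 15:
        return "critical"
    if score >= 9:
        return "high"
    if score >= 4:
        return "medium"
    return "low"

assert module_risk_level(5, 4) == "critical"
assert module_risk_level(1, 2) == "low"
```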

View File

@@ -0,0 +1,30 @@
"""Domain layer: value objects, enums, and domain exceptions.
Pure Python — no FastAPI, no SQLAlchemy, no HTTP concerns. Upper layers depend on
this package; it depends on nothing except the standard library and small libraries
like ``pydantic`` or ``attrs``.
"""
class DomainError(Exception):
"""Base class for all domain-level errors.
Services raise subclasses of this; the HTTP layer is responsible for mapping
them to status codes. Never raise ``HTTPException`` from a service.
"""
class NotFoundError(DomainError):
"""Requested entity does not exist."""
class ConflictError(DomainError):
"""Operation conflicts with the current state (e.g. duplicate, stale version)."""
class ValidationError(DomainError):
"""Input failed domain-level validation (beyond what Pydantic catches)."""
class PermissionError(DomainError):
"""Caller lacks permission for the operation."""

View File

@@ -0,0 +1,10 @@
"""Repository layer: database access.
Each aggregate gets its own module (e.g. ``dsr_repository.py``) exposing a single
class with intent-named methods. Repositories own SQLAlchemy session usage; they
do not run business logic, and they do not import anything from
``compliance.api`` or ``compliance.services``.
Phase 1 refactor target: ``compliance.db.repository`` (1547 lines) is being
decomposed into per-aggregate modules under this package.
"""

View File

@@ -0,0 +1,11 @@
"""Pydantic schemas, split per domain.
Phase 1 refactor target: the monolithic ``compliance.api.schemas`` module (1899 lines)
is being decomposed into one module per domain under this package. Until every domain
has been migrated, ``compliance.api.schemas`` re-exports from here so existing imports
continue to work unchanged.
New code MUST import from the specific domain module (e.g.
``from compliance.schemas.dsr import DSRRequestCreate``) rather than from
``compliance.api.schemas``.
"""

View File

@@ -16,7 +16,7 @@ Uses reportlab for PDF generation (lightweight, no external dependencies).
 import io
 import logging
-from datetime import datetime
+from datetime import datetime, timezone
 from typing import Dict, List, Any, Optional, Tuple
 from sqlalchemy.orm import Session
@@ -255,7 +255,7 @@ class AuditPDFGenerator:
         doc.build(story)
         # Generate filename
-        date_str = datetime.utcnow().strftime('%Y%m%d')
+        date_str = datetime.now(timezone.utc).strftime('%Y%m%d')
         filename = f"audit_report_{session.name.replace(' ', '_')}_{date_str}.pdf"
         return buffer.getvalue(), filename
@@ -429,7 +429,7 @@ class AuditPDFGenerator:
         story.append(Spacer(1, 30*mm))
         gen_label = 'Generiert am' if language == 'de' else 'Generated on'
         story.append(Paragraph(
-            f"{gen_label}: {datetime.utcnow().strftime('%d.%m.%Y %H:%M')} UTC",
+            f"{gen_label}: {datetime.now(timezone.utc).strftime('%d.%m.%Y %H:%M')} UTC",
             self.styles['Footer']
         ))
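The `@@ -255` hunk builds the report filename from the session name and the UTC date. The same pattern as a standalone helper (simplified; `audit_report_filename` is an illustrative name):

```python
from datetime import datetime, timezone

def audit_report_filename(session_name, now=None):
    """Simplified mirror of the filename pattern in the hunk above."""
    now = now or datetime.now(timezone.utc)
    return f"audit_report_{session_name.replace(' ', '_')}_{now.strftime('%Y%m%d')}.pdf"

name = audit_report_filename("ISO 27001 Stage 1", datetime(2026, 1, 18, tzinfo=timezone.utc))
assert name == "audit_report_ISO_27001_Stage_1_20260118.pdf"
```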

View File

@@ -11,7 +11,7 @@ Sprint 6: CI/CD Evidence Collection (2026-01-18)
 """
 import logging
-from datetime import datetime
+from datetime import datetime, timezone
 from typing import Dict, List, Optional
 from dataclasses import dataclass
 from enum import Enum
@@ -140,7 +140,7 @@ class AutoRiskUpdater:
         if new_status != old_status:
             control.status = ControlStatusEnum(new_status)
             control.status_notes = self._generate_status_notes(scan_result)
-            control.updated_at = datetime.utcnow()
+            control.updated_at = datetime.now(timezone.utc)
             control_updated = True
             logger.info(f"Control {scan_result.control_id} status changed: {old_status} -> {new_status}")
@@ -225,7 +225,7 @@ class AutoRiskUpdater:
             source="ci_pipeline",
             ci_job_id=scan_result.ci_job_id,
             status=EvidenceStatusEnum.VALID,
-            valid_from=datetime.utcnow(),
+            valid_from=datetime.now(timezone.utc),
             collected_at=scan_result.timestamp,
         )
@@ -298,8 +298,8 @@ class AutoRiskUpdater:
             risk_updated = True
         if risk_updated:
-            risk.last_assessed_at = datetime.utcnow()
-            risk.updated_at = datetime.utcnow()
+            risk.last_assessed_at = datetime.now(timezone.utc)
+            risk.updated_at = datetime.now(timezone.utc)
             affected_risks.append(risk.risk_id)
             logger.info(f"Updated risk {risk.risk_id} due to control {control.control_id} status change")
@@ -354,7 +354,7 @@ class AutoRiskUpdater:
         try:
             ts = datetime.fromisoformat(timestamp.replace('Z', '+00:00'))
         except (ValueError, AttributeError):
-            ts = datetime.utcnow()
+            ts = datetime.now(timezone.utc)
         # Determine scan type from evidence_type
         scan_type = ScanType.SAST  # Default
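The `@@ -354` hunk parses CI timestamps with `fromisoformat` after normalizing the `Z` suffix, falling back to the current UTC time. As a standalone sketch:

```python
from datetime import datetime, timezone

def parse_ci_timestamp(raw):
    """Mirror of the fallback parsing in the hunk above."""
    try:
        return datetime.fromisoformat(raw.replace('Z', '+00:00'))
    except (ValueError, AttributeError):
        return datetime.now(timezone.utc)

ts = parse_ci_timestamp("2026-01-18T12:30:00Z")
assert ts.year == 2026 and ts.utcoffset().total_seconds() == 0

fallback = parse_ci_timestamp(None)  # AttributeError path
assert fallback.tzinfo is timezone.utc
```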

View File

@@ -16,7 +16,7 @@ import os
 import shutil
 import tempfile
 import zipfile
-from datetime import datetime, date
+from datetime import datetime, date, timezone
 from pathlib import Path
 from typing import Dict, List, Optional, Any
@@ -98,7 +98,7 @@ class AuditExportGenerator:
         export_record.file_hash = file_hash
         export_record.file_size_bytes = file_size
         export_record.status = ExportStatusEnum.COMPLETED
-        export_record.completed_at = datetime.utcnow()
+        export_record.completed_at = datetime.now(timezone.utc)
         # Calculate statistics
         stats = self._calculate_statistics(
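The export record stores `file_hash` and `file_size_bytes`. A hedged sketch of computing both for a bundle on disk; `hash_and_size` is a hypothetical helper, not code from this commit:

```python
import hashlib
import os
import tempfile

def hash_and_size(path):
    """SHA-256 digest and byte size for an export bundle on disk."""
    h = hashlib.sha256()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(8192), b''):
            h.update(chunk)
    return h.hexdigest(), os.path.getsize(path)

with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"audit export bundle")
    tmp_path = f.name

digest, size = hash_and_size(tmp_path)
assert size == len(b"audit export bundle")
assert digest == hashlib.sha256(b"audit export bundle").hexdigest()
os.unlink(tmp_path)
```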

View File

@@ -11,7 +11,7 @@ Similar pattern to edu-search and zeugnisse-crawler.
 import logging
 import re
-from datetime import datetime
+from datetime import datetime, timezone
 from typing import Dict, List, Any, Optional
 from enum import Enum
@@ -198,7 +198,7 @@ class RegulationScraperService:
     async def scrape_all(self) -> Dict[str, Any]:
         """Scrape all known regulation sources."""
         self.status = ScraperStatus.RUNNING
-        self.stats["last_run"] = datetime.utcnow().isoformat()
+        self.stats["last_run"] = datetime.now(timezone.utc).isoformat()
         results = {
             "success": [],

View File

@@ -11,7 +11,7 @@ Reports include:
 """
 import logging
-from datetime import datetime, date, timedelta
+from datetime import datetime, date, timedelta, timezone
 from typing import Dict, List, Any, Optional
 from enum import Enum
@@ -75,7 +75,7 @@ class ComplianceReportGenerator:
         report = {
             "report_metadata": {
-                "generated_at": datetime.utcnow().isoformat(),
+                "generated_at": datetime.now(timezone.utc).isoformat(),
                 "period": period.value,
                 "as_of_date": as_of_date.isoformat(),
                 "date_range_start": date_range["start"].isoformat(),
@@ -415,7 +415,7 @@ class ComplianceReportGenerator:
         evidence_stats = self.evidence_repo.get_statistics()
         return {
-            "generated_at": datetime.utcnow().isoformat(),
+            "generated_at": datetime.now(timezone.utc).isoformat(),
             "compliance_score": stats.get("compliance_score", 0),
             "controls": {
                 "total": stats.get("total", 0),

View File

@@ -8,7 +8,7 @@ Run with: pytest backend/compliance/tests/test_audit_routes.py -v
 import pytest
 import hashlib
-from datetime import datetime
+from datetime import datetime, timezone
 from unittest.mock import MagicMock
 from uuid import uuid4
@@ -78,7 +78,7 @@ def sample_session():
         completed_items=0,
         compliant_count=0,
         non_compliant_count=0,
-        created_at=datetime.utcnow(),
+        created_at=datetime.now(timezone.utc),
     )
@@ -94,7 +94,7 @@ def sample_signoff(sample_session, sample_requirement):
         signature_hash=None,
         signed_at=None,
         signed_by=None,
-        created_at=datetime.utcnow(),
+        created_at=datetime.now(timezone.utc),
     )
@@ -214,7 +214,7 @@ class TestAuditSessionLifecycle:
         assert sample_session.status == AuditSessionStatusEnum.DRAFT
         sample_session.status = AuditSessionStatusEnum.IN_PROGRESS
-        sample_session.started_at = datetime.utcnow()
+        sample_session.started_at = datetime.now(timezone.utc)
         assert sample_session.status == AuditSessionStatusEnum.IN_PROGRESS
         assert sample_session.started_at is not None
@@ -231,7 +231,7 @@
         sample_session.status = AuditSessionStatusEnum.IN_PROGRESS
         sample_session.status = AuditSessionStatusEnum.COMPLETED
-        sample_session.completed_at = datetime.utcnow()
+        sample_session.completed_at = datetime.now(timezone.utc)
         assert sample_session.status == AuditSessionStatusEnum.COMPLETED
         assert sample_session.completed_at is not None
@@ -353,7 +353,7 @@ class TestSignOff:
     def test_signoff_with_signature_creates_hash(self, sample_session, sample_requirement):
         """Signing off with signature should create SHA-256 hash."""
         result = AuditResultEnum.COMPLIANT
-        timestamp = datetime.utcnow().isoformat()
+        timestamp = datetime.now(timezone.utc).isoformat()
         data = f"{result.value}|{sample_requirement.id}|{sample_session.auditor_name}|{timestamp}"
         signature_hash = hashlib.sha256(data.encode()).hexdigest()
@@ -382,7 +382,7 @@ class TestSignOff:
         # First sign-off should trigger auto-start
         sample_session.status = AuditSessionStatusEnum.IN_PROGRESS
-        sample_session.started_at = datetime.utcnow()
+        sample_session.started_at = datetime.now(timezone.utc)
         assert sample_session.status == AuditSessionStatusEnum.IN_PROGRESS
@@ -390,7 +390,7 @@
         """Updating an existing sign-off should work."""
         sample_signoff.result = AuditResultEnum.NON_COMPLIANT
         sample_signoff.notes = "Updated: needs improvement"
-        sample_signoff.updated_at = datetime.utcnow()
+        sample_signoff.updated_at = datetime.now(timezone.utc)
         assert sample_signoff.result == AuditResultEnum.NON_COMPLIANT
         assert "Updated" in sample_signoff.notes
@@ -423,7 +423,7 @@ class TestGetSignOff:
         # With signature
         sample_signoff.signature_hash = "abc123"
-        sample_signoff.signed_at = datetime.utcnow()
+        sample_signoff.signed_at = datetime.now(timezone.utc)
         sample_signoff.signed_by = "Test Auditor"
         assert sample_signoff.signature_hash == "abc123"
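The sign-off test above hashes a `|`-joined payload of result, requirement, auditor, and timestamp. The same construction as a standalone helper:

```python
import hashlib
from datetime import datetime, timezone

def sign_off_hash(result, requirement_id, auditor, timestamp):
    """The same '|'-joined payload the test above hashes."""
    data = f"{result}|{requirement_id}|{auditor}|{timestamp}"
    return hashlib.sha256(data.encode()).hexdigest()

ts = datetime(2026, 1, 18, tzinfo=timezone.utc).isoformat()
signature = sign_off_hash("compliant", "req-123", "Test Auditor", ts)
assert len(signature) == 64
assert signature == sign_off_hash("compliant", "req-123", "Test Auditor", ts)  # deterministic
```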

View File

@@ -4,7 +4,7 @@ Tests for the AutoRiskUpdater Service.
 Sprint 6: CI/CD Evidence Collection & Automatic Risk Updates (2026-01-18)
 """
-from datetime import datetime
+from datetime import datetime, timezone
 from unittest.mock import MagicMock
 from ..services.auto_risk_updater import (
@@ -188,7 +188,7 @@ class TestGenerateAlerts:
         scan_result = ScanResult(
             scan_type=ScanType.DEPENDENCY,
             tool="Trivy",
-            timestamp=datetime.utcnow(),
+            timestamp=datetime.now(timezone.utc),
             commit_sha="abc123",
             branch="main",
             control_id="SDLC-002",
@@ -209,7 +209,7 @@ class TestGenerateAlerts:
         scan_result = ScanResult(
             scan_type=ScanType.SAST,
             tool="Semgrep",
-            timestamp=datetime.utcnow(),
+            timestamp=datetime.now(timezone.utc),
             commit_sha="def456",
             branch="main",
             control_id="SDLC-001",
@@ -228,7 +228,7 @@ class TestGenerateAlerts:
         scan_result = ScanResult(
             scan_type=ScanType.CONTAINER,
             tool="Trivy",
-            timestamp=datetime.utcnow(),
+            timestamp=datetime.now(timezone.utc),
             commit_sha="ghi789",
             branch="main",
             control_id="SDLC-006",
@@ -247,7 +247,7 @@ class TestGenerateAlerts:
         scan_result = ScanResult(
             scan_type=ScanType.SAST,
             tool="Semgrep",
-            timestamp=datetime.utcnow(),
+            timestamp=datetime.now(timezone.utc),
             commit_sha="jkl012",
             branch="main",
             control_id="SDLC-001",
@@ -369,7 +369,7 @@ class TestScanResult:
         result = ScanResult(
             scan_type=ScanType.DEPENDENCY,
             tool="Trivy",
-            timestamp=datetime.utcnow(),
+            timestamp=datetime.now(timezone.utc),
             commit_sha="xyz789",
             branch="develop",
             control_id="SDLC-002",

View File

@@ -8,7 +8,7 @@ Run with: pytest compliance/tests/test_compliance_routes.py -v
"""
import pytest
from datetime import datetime
from datetime import datetime, timezone
from unittest.mock import MagicMock
from uuid import uuid4
@@ -41,8 +41,8 @@ def sample_regulation():
name="Datenschutz-Grundverordnung",
full_name="Verordnung (EU) 2016/679",
is_active=True,
created_at=datetime.utcnow(),
updated_at=datetime.utcnow(),
created_at=datetime.now(timezone.utc),
updated_at=datetime.now(timezone.utc),
)
@@ -57,8 +57,8 @@ def sample_requirement(sample_regulation):
description="Personenbezogene Daten duerfen nur verarbeitet werden, wenn eine Rechtsgrundlage vorliegt.",
priority=4,
is_applicable=True,
created_at=datetime.utcnow(),
updated_at=datetime.utcnow(),
created_at=datetime.now(timezone.utc),
updated_at=datetime.now(timezone.utc),
)
@@ -74,8 +74,8 @@ def sample_ai_system():
classification=AIClassificationEnum.UNCLASSIFIED,
status=AISystemStatusEnum.DRAFT,
obligations=[],
-created_at=datetime.utcnow(),
-updated_at=datetime.utcnow(),
+created_at=datetime.now(timezone.utc),
+updated_at=datetime.now(timezone.utc),
)
@@ -96,8 +96,8 @@ class TestCreateRequirement:
description="Geeignete technische Massnahmen",
priority=3,
is_applicable=True,
-created_at=datetime.utcnow(),
-updated_at=datetime.utcnow(),
+created_at=datetime.now(timezone.utc),
+updated_at=datetime.now(timezone.utc),
)
assert req.regulation_id == sample_regulation.id
@@ -196,7 +196,7 @@ class TestUpdateRequirement:
def test_update_audit_status_sets_audit_date(self, sample_requirement):
"""Updating audit_status should set last_audit_date."""
sample_requirement.audit_status = "compliant"
-sample_requirement.last_audit_date = datetime.utcnow()
+sample_requirement.last_audit_date = datetime.now(timezone.utc)
assert sample_requirement.audit_status == "compliant"
assert sample_requirement.last_audit_date is not None
@@ -287,7 +287,7 @@ class TestAISystemCRUD:
def test_update_ai_system_with_assessment(self, sample_ai_system):
"""After assessment, system should have assessment_date and result."""
-sample_ai_system.assessment_date = datetime.utcnow()
+sample_ai_system.assessment_date = datetime.now(timezone.utc)
sample_ai_system.assessment_result = {
"overall_risk": "high",
"risk_factors": [{"factor": "education sector", "severity": "high"}],


@@ -15,7 +15,7 @@ Run with: pytest backend/compliance/tests/test_isms_routes.py -v
"""
import pytest
-from datetime import datetime, date
+from datetime import datetime, date, timezone
from unittest.mock import MagicMock
from uuid import uuid4
@@ -56,7 +56,7 @@ def sample_scope():
status=ApprovalStatusEnum.DRAFT,
version="1.0",
created_by="admin@breakpilot.de",
-created_at=datetime.utcnow(),
+created_at=datetime.now(timezone.utc),
)
@@ -65,7 +65,7 @@ def sample_approved_scope(sample_scope):
"""Create an approved ISMS scope for testing."""
sample_scope.status = ApprovalStatusEnum.APPROVED
sample_scope.approved_by = "ceo@breakpilot.de"
-sample_scope.approved_at = datetime.utcnow()
+sample_scope.approved_at = datetime.now(timezone.utc)
sample_scope.effective_date = date.today()
sample_scope.review_date = date(date.today().year + 1, date.today().month, date.today().day)
sample_scope.approval_signature = "sha256_signature_hash"
@@ -88,7 +88,7 @@ def sample_policy():
authored_by="iso@breakpilot.de",
status=ApprovalStatusEnum.DRAFT,
version="1.0",
-created_at=datetime.utcnow(),
+created_at=datetime.now(timezone.utc),
)
@@ -116,7 +116,7 @@ def sample_objective():
related_controls=["OPS-003"],
status="active",
progress_percentage=0.0,
-created_at=datetime.utcnow(),
+created_at=datetime.now(timezone.utc),
)
@@ -136,7 +136,7 @@ def sample_soa_entry():
coverage_level="full",
evidence_description="ISMS Policy v2.0, signed by CEO",
version="1.0",
-created_at=datetime.utcnow(),
+created_at=datetime.now(timezone.utc),
)
@@ -158,7 +158,7 @@ def sample_finding():
identified_date=date.today(),
due_date=date(2026, 3, 31),
status=FindingStatusEnum.OPEN,
-created_at=datetime.utcnow(),
+created_at=datetime.now(timezone.utc),
)
@@ -178,7 +178,7 @@ def sample_major_finding():
identified_date=date.today(),
due_date=date(2026, 2, 28),
status=FindingStatusEnum.OPEN,
-created_at=datetime.utcnow(),
+created_at=datetime.now(timezone.utc),
)
@@ -198,7 +198,7 @@ def sample_capa(sample_finding):
planned_completion=date(2026, 2, 15),
effectiveness_criteria="Document approved and distributed to audit team",
status="planned",
-created_at=datetime.utcnow(),
+created_at=datetime.now(timezone.utc),
)
@@ -219,7 +219,7 @@ def sample_management_review():
{"name": "ISO", "role": "ISMS Manager"},
],
status="draft",
-created_at=datetime.utcnow(),
+created_at=datetime.now(timezone.utc),
)
@@ -239,7 +239,7 @@ def sample_internal_audit():
lead_auditor="internal.auditor@breakpilot.de",
audit_team=["internal.auditor@breakpilot.de", "qa@breakpilot.de"],
status="planned",
-created_at=datetime.utcnow(),
+created_at=datetime.now(timezone.utc),
)
@@ -502,7 +502,7 @@ class TestISMSReadinessCheck:
"""Readiness check should identify potential major findings."""
check = ISMSReadinessCheckDB(
id=str(uuid4()),
-check_date=datetime.utcnow(),
+check_date=datetime.now(timezone.utc),
triggered_by="admin@breakpilot.de",
overall_status="not_ready",
certification_possible=False,
@@ -532,7 +532,7 @@ class TestISMSReadinessCheck:
"""Readiness check should show status for each ISO chapter."""
check = ISMSReadinessCheckDB(
id=str(uuid4()),
-check_date=datetime.utcnow(),
+check_date=datetime.now(timezone.utc),
triggered_by="admin@breakpilot.de",
overall_status="ready",
certification_possible=True,
@@ -606,7 +606,7 @@ class TestAuditTrail:
entity_name="ISMS Scope v1.0",
action="approve",
performed_by="ceo@breakpilot.de",
-performed_at=datetime.utcnow(),
+performed_at=datetime.now(timezone.utc),
checksum="sha256_hash",
)
@@ -630,7 +630,7 @@ class TestAuditTrail:
new_value="approved",
change_summary="Policy approved by CEO",
performed_by="ceo@breakpilot.de",
-performed_at=datetime.utcnow(),
+performed_at=datetime.now(timezone.utc),
checksum="sha256_hash",
)


@@ -5,7 +5,7 @@ Kommuniziert mit dem Consent Management Service für GDPR-Compliance
import httpx
import jwt
-from datetime import datetime, timedelta
+from datetime import datetime, timedelta, timezone
from typing import Optional, List, Dict, Any
from dataclasses import dataclass
from enum import Enum
@@ -44,8 +44,8 @@ def generate_jwt_token(
"user_id": user_id,
"email": email,
"role": role,
-"exp": datetime.utcnow() + timedelta(hours=expires_hours),
-"iat": datetime.utcnow(),
+"exp": datetime.now(timezone.utc) + timedelta(hours=expires_hours),
+"iat": datetime.now(timezone.utc),
}
return jwt.encode(payload, JWT_SECRET, algorithm="HS256")
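The whole sweep hinges on one distinction: `datetime.utcnow()` returns a *naive* datetime (its `tzinfo` is `None`, even though the wall-clock value is UTC), while `datetime.now(timezone.utc)` returns an aware one. A standalone illustration, not taken from the diff:

```python
from datetime import datetime, timezone

# Deprecated since Python 3.12: naive datetime, UTC wall-clock value.
naive = datetime.utcnow()
# Replacement: timezone-aware datetime.
aware = datetime.now(timezone.utc)

assert naive.tzinfo is None
assert aware.tzinfo is timezone.utc

# The latent bug the sweep fixes: mixing the two styles blows up on comparison.
try:
    _ = naive < aware
except TypeError as exc:
    print(f"cannot compare: {exc}")
```

This is why the commit message calls the change a latent-bug fix on the Python side: the columns were already `timezone=True`, but any code comparing an aware DB value against a naive `utcnow()` would raise at runtime.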

File diff suppressed because it is too large.


@@ -0,0 +1,25 @@
#!/usr/bin/env python3
"""Regenerate the OpenAPI baseline.

Run this ONLY when you have intentionally made an additive API change and want
the contract test to pick up the new baseline. Removing or renaming anything is
a breaking change and requires updating every consumer in the same change set.

Usage:
    python tests/contracts/regenerate_baseline.py
"""
from __future__ import annotations

import json
import sys
from pathlib import Path

THIS_DIR = Path(__file__).parent
REPO_ROOT = THIS_DIR.parent.parent  # backend-compliance/
sys.path.insert(0, str(REPO_ROOT))

from main import app  # type: ignore[import-not-found]  # noqa: E402

out = THIS_DIR / "openapi.baseline.json"
out.write_text(json.dumps(app.openapi(), indent=2, sort_keys=True) + "\n")
print(f"wrote {out}")


@@ -0,0 +1,102 @@
"""OpenAPI contract test.

This test pins the public HTTP contract of backend-compliance. It loads the
FastAPI app, extracts the live OpenAPI schema, and compares it against a
checked-in baseline at ``tests/contracts/openapi.baseline.json``.

Rules:
- Adding new paths/operations/fields → OK (additive change).
- Removing a path, changing a method, changing a status code, removing or
  renaming a response/request field → FAIL. Such changes require updating
  every consumer (admin-compliance, developer-portal, SDKs) in the same
  change, then regenerating the baseline with:
      python tests/contracts/regenerate_baseline.py
  and explaining the contract change in the PR description.

The baseline is missing on first run — the test prints the command to create
it and skips. This is intentional: Phase 1 step 1 generates it fresh from the
current app state before any refactoring begins.
"""
from __future__ import annotations

import json
from pathlib import Path
from typing import Any

import pytest

BASELINE_PATH = Path(__file__).parent / "openapi.baseline.json"


def _load_live_schema() -> dict[str, Any]:
    """Import the FastAPI app and extract its OpenAPI schema.

    Kept inside the function so that test collection does not fail if the app
    has import-time side effects that aren't satisfied in the test env.
    """
    from main import app  # type: ignore[import-not-found]

    return app.openapi()


def _collect_operations(schema: dict[str, Any]) -> dict[str, dict[str, Any]]:
    """Return a flat {f'{METHOD} {path}': operation} map for diffing."""
    out: dict[str, dict[str, Any]] = {}
    for path, methods in schema.get("paths", {}).items():
        for method, op in methods.items():
            if method.lower() in {"get", "post", "put", "patch", "delete", "options", "head"}:
                out[f"{method.upper()} {path}"] = op
    return out


@pytest.mark.contract
def test_openapi_no_breaking_changes() -> None:
    if not BASELINE_PATH.exists():
        pytest.skip(
            f"Baseline missing. Run: python {Path(__file__).parent}/regenerate_baseline.py"
        )
    baseline = json.loads(BASELINE_PATH.read_text())
    live = _load_live_schema()
    baseline_ops = _collect_operations(baseline)
    live_ops = _collect_operations(live)

    # 1. No operation may disappear.
    removed = sorted(set(baseline_ops) - set(live_ops))
    assert not removed, (
        f"Breaking change: {len(removed)} operation(s) removed from public API:\n  "
        + "\n  ".join(removed)
    )

    # 2. For operations that exist in both, response status codes must be a superset.
    for key, baseline_op in baseline_ops.items():
        live_op = live_ops[key]
        baseline_codes = set((baseline_op.get("responses") or {}).keys())
        live_codes = set((live_op.get("responses") or {}).keys())
        missing = baseline_codes - live_codes
        assert not missing, (
            f"Breaking change: {key} no longer returns status code(s) {sorted(missing)}"
        )

    # 3. Required request-body fields may not be added (would break existing clients).
    for key, baseline_op in baseline_ops.items():
        live_op = live_ops[key]
        base_req = _required_body_fields(baseline_op)
        live_req = _required_body_fields(live_op)
        new_required = live_req - base_req
        assert not new_required, (
            f"Breaking change: {key} added required request field(s) {sorted(new_required)}"
        )


def _required_body_fields(op: dict[str, Any]) -> set[str]:
    """Required field names of the operation's request body.

    Only the first declared media type (typically application/json) is
    inspected, hence the return inside the loop.
    """
    rb = op.get("requestBody") or {}
    content = rb.get("content") or {}
    for media in content.values():
        schema = media.get("schema") or {}
        return set(schema.get("required") or [])
    return set()
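To see what the flattening helper in the contract test produces, here is a self-contained restatement run against a toy schema (the path and status codes below are made up for illustration):

```python
from typing import Any

HTTP_METHODS = {"get", "post", "put", "patch", "delete", "options", "head"}

def collect_operations(schema: dict[str, Any]) -> dict[str, dict[str, Any]]:
    """Flatten an OpenAPI schema into {'METHOD /path': operation}."""
    out: dict[str, dict[str, Any]] = {}
    for path, methods in schema.get("paths", {}).items():
        for method, op in methods.items():
            if method.lower() in HTTP_METHODS:
                out[f"{method.upper()} {path}"] = op
    return out

# Hypothetical schema fragment, for illustration only.
schema = {
    "paths": {
        "/api/compliance/example": {
            "get": {"responses": {"200": {}}},
            "post": {"responses": {"201": {}}},
            "parameters": [],  # path-level key, not an HTTP method -> skipped
        }
    }
}
ops = collect_operations(schema)
assert set(ops) == {"GET /api/compliance/example", "POST /api/compliance/example"}
```

Flattening to `"METHOD path"` keys turns the breaking-change checks into plain set arithmetic: removed operations are `baseline - live`, and per-operation comparisons become dict lookups.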


@@ -10,7 +10,7 @@ import pytest
import uuid
import os
import sys
-from datetime import datetime
+from datetime import datetime, timezone
from unittest.mock import MagicMock
from fastapi import FastAPI
@@ -51,7 +51,7 @@ _RawSessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
@event.listens_for(engine, "connect")
def _register_sqlite_functions(dbapi_conn, connection_record):
"""Register PostgreSQL-compatible functions for SQLite."""
-dbapi_conn.create_function("NOW", 0, lambda: datetime.utcnow().isoformat())
+dbapi_conn.create_function("NOW", 0, lambda: datetime.now(timezone.utc).isoformat())
TENANT_ID = "default"
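The conftest hunk above registers a Python-side `NOW()` so that PostgreSQL-style `DEFAULT NOW()` columns keep working against the SQLite test database. A self-contained sketch of the same trick, showing why the aware variant matters for the stored value:

```python
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")
# SQLite has no NOW(); register a 0-argument Python function under that name.
conn.create_function("NOW", 0, lambda: datetime.now(timezone.utc).isoformat())

(ts,) = conn.execute("SELECT NOW()").fetchone()
# The aware datetime serializes with an explicit UTC offset; the old
# utcnow() form would have produced a naive string with no offset at all.
assert ts.endswith("+00:00")
```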


@@ -6,7 +6,7 @@ Pattern: app.dependency_overrides[get_db] for FastAPI DI.
import uuid
import os
import sys
-from datetime import datetime, timedelta
+from datetime import datetime, timedelta, timezone
import pytest
from fastapi import FastAPI
@@ -75,7 +75,7 @@ def db_session():
def _create_dsr_in_db(db, **kwargs):
"""Helper to create a DSR directly in DB."""
-now = datetime.utcnow()
+now = datetime.now(timezone.utc)
defaults = {
"tenant_id": uuid.UUID(TENANT_ID),
"request_number": f"DSR-2026-{str(uuid.uuid4())[:6].upper()}",
@@ -241,8 +241,8 @@ class TestListDSR:
assert len(data["requests"]) == 2
def test_list_overdue_only(self, db_session):
-_create_dsr_in_db(db_session, deadline_at=datetime.utcnow() - timedelta(days=5), status="processing")
-_create_dsr_in_db(db_session, deadline_at=datetime.utcnow() + timedelta(days=20), status="processing")
+_create_dsr_in_db(db_session, deadline_at=datetime.now(timezone.utc) - timedelta(days=5), status="processing")
+_create_dsr_in_db(db_session, deadline_at=datetime.now(timezone.utc) + timedelta(days=20), status="processing")
resp = client.get("/api/compliance/dsr?overdue_only=true", headers=HEADERS)
assert resp.status_code == 200
@@ -339,7 +339,7 @@ class TestDSRStats:
_create_dsr_in_db(db_session, status="intake", request_type="access")
_create_dsr_in_db(db_session, status="processing", request_type="erasure")
_create_dsr_in_db(db_session, status="completed", request_type="access",
-completed_at=datetime.utcnow())
+completed_at=datetime.now(timezone.utc))
resp = client.get("/api/compliance/dsr/stats", headers=HEADERS)
assert resp.status_code == 200
@@ -561,9 +561,9 @@ class TestDeadlineProcessing:
def test_process_deadlines_with_overdue(self, db_session):
_create_dsr_in_db(db_session, status="processing",
-deadline_at=datetime.utcnow() - timedelta(days=5))
+deadline_at=datetime.now(timezone.utc) - timedelta(days=5))
_create_dsr_in_db(db_session, status="processing",
-deadline_at=datetime.utcnow() + timedelta(days=20))
+deadline_at=datetime.now(timezone.utc) + timedelta(days=20))
resp = client.post("/api/compliance/dsr/deadlines/process", headers=HEADERS)
assert resp.status_code == 200
@@ -609,7 +609,7 @@ class TestDSRTemplates:
subject="Bestaetigung",
body_html="<p>Test</p>",
status="published",
-published_at=datetime.utcnow(),
+published_at=datetime.now(timezone.utc),
)
db_session.add(v)
db_session.commit()


@@ -7,7 +7,7 @@ Consent widerrufen, Statistiken.
import pytest
from unittest.mock import MagicMock, patch
-from datetime import datetime
+from datetime import datetime, timezone
import uuid
@@ -25,7 +25,7 @@ def make_catalog(tenant_id='test-tenant'):
rec.tenant_id = tenant_id
rec.selected_data_point_ids = ['dp-001', 'dp-002']
rec.custom_data_points = []
-rec.updated_at = datetime.utcnow()
+rec.updated_at = datetime.now(timezone.utc)
return rec
@@ -34,7 +34,7 @@ def make_company(tenant_id='test-tenant'):
rec.id = uuid.uuid4()
rec.tenant_id = tenant_id
rec.data = {'company_name': 'Test GmbH', 'email': 'datenschutz@test.de'}
-rec.updated_at = datetime.utcnow()
+rec.updated_at = datetime.now(timezone.utc)
return rec
@@ -47,7 +47,7 @@ def make_cookies(tenant_id='test-tenant'):
{'id': 'analytics', 'name': 'Analyse', 'isRequired': False, 'defaultEnabled': False},
]
rec.config = {'position': 'bottom', 'style': 'bar'}
-rec.updated_at = datetime.utcnow()
+rec.updated_at = datetime.now(timezone.utc)
return rec
@@ -58,13 +58,13 @@ def make_consent(tenant_id='test-tenant', user_id='user-001', data_point_id='dp-
rec.user_id = user_id
rec.data_point_id = data_point_id
rec.granted = granted
-rec.granted_at = datetime.utcnow()
+rec.granted_at = datetime.now(timezone.utc)
rec.revoked_at = None
rec.consent_version = '1.0'
rec.source = 'website'
rec.ip_address = None
rec.user_agent = None
-rec.created_at = datetime.utcnow()
+rec.created_at = datetime.now(timezone.utc)
return rec
@@ -263,7 +263,7 @@ class TestConsentDB:
user_id='user-001',
data_point_id='dp-marketing',
granted=True,
-granted_at=datetime.utcnow(),
+granted_at=datetime.now(timezone.utc),
consent_version='1.0',
source='website',
)
@@ -276,13 +276,13 @@ class TestConsentDB:
consent = make_consent()
assert consent.revoked_at is None
-consent.revoked_at = datetime.utcnow()
+consent.revoked_at = datetime.now(timezone.utc)
assert consent.revoked_at is not None
def test_cannot_revoke_already_revoked(self):
"""Should not be possible to revoke an already revoked consent."""
consent = make_consent()
-consent.revoked_at = datetime.utcnow()
+consent.revoked_at = datetime.now(timezone.utc)
# Simulate the guard logic from the route
already_revoked = consent.revoked_at is not None
@@ -315,7 +315,7 @@ class TestConsentStats:
make_consent(user_id='user-2', data_point_id='dp-1', granted=True),
]
# Revoke one
-consents[1].revoked_at = datetime.utcnow()
+consents[1].revoked_at = datetime.now(timezone.utc)
total = len(consents)
active = sum(1 for c in consents if c.granted and not c.revoked_at)
@@ -334,7 +334,7 @@ class TestConsentStats:
make_consent(user_id='user-2', granted=True),
make_consent(user_id='user-3', granted=True),
]
-consents[2].revoked_at = datetime.utcnow() # user-3 revoked
+consents[2].revoked_at = datetime.now(timezone.utc) # user-3 revoked
unique_users = len(set(c.user_id for c in consents))
users_with_active = len(set(c.user_id for c in consents if c.granted and not c.revoked_at))
@@ -501,7 +501,7 @@ class TestConsentHistoryTracking:
from compliance.db.einwilligungen_models import EinwilligungenConsentHistoryDB
consent = make_consent()
-consent.revoked_at = datetime.utcnow()
+consent.revoked_at = datetime.now(timezone.utc)
entry = EinwilligungenConsentHistoryDB(
consent_id=consent.id,
tenant_id=consent.tenant_id,
@@ -516,7 +516,7 @@ class TestConsentHistoryTracking:
entry_id = _uuid.uuid4()
consent_id = _uuid.uuid4()
-now = datetime.utcnow()
+now = datetime.now(timezone.utc)
row = {
"id": str(entry_id),


@@ -13,7 +13,7 @@ Run with: cd backend-compliance && python3 -m pytest tests/test_isms_routes.py -
import os
import sys
import pytest
-from datetime import date, datetime
+from datetime import date, datetime, timezone
from fastapi import FastAPI
from fastapi.testclient import TestClient
@@ -40,7 +40,7 @@ def _set_sqlite_pragma(dbapi_conn, connection_record):
cursor = dbapi_conn.cursor()
cursor.execute("PRAGMA foreign_keys=ON")
cursor.close()
-dbapi_conn.create_function("NOW", 0, lambda: datetime.utcnow().isoformat())
+dbapi_conn.create_function("NOW", 0, lambda: datetime.now(timezone.utc).isoformat())
app = FastAPI()


@@ -7,7 +7,7 @@ Rejection-Flow, approval history.
import pytest
from unittest.mock import MagicMock, patch
-from datetime import datetime
+from datetime import datetime, timezone
import uuid
@@ -27,7 +27,7 @@ def make_document(type='privacy_policy', name='Datenschutzerklärung', tenant_id
doc.name = name
doc.description = 'Test description'
doc.mandatory = False
-doc.created_at = datetime.utcnow()
+doc.created_at = datetime.now(timezone.utc)
doc.updated_at = None
return doc
@@ -46,7 +46,7 @@ def make_version(document_id=None, version='1.0', status='draft', title='Test Ve
v.approved_by = None
v.approved_at = None
v.rejection_reason = None
-v.created_at = datetime.utcnow()
+v.created_at = datetime.now(timezone.utc)
v.updated_at = None
return v
@@ -58,7 +58,7 @@ def make_approval(version_id=None, action='created'):
a.action = action
a.approver = 'admin@test.de'
a.comment = None
-a.created_at = datetime.utcnow()
+a.created_at = datetime.now(timezone.utc)
return a
@@ -179,7 +179,7 @@ class TestVersionToResponse:
from compliance.api.legal_document_routes import _version_to_response
v = make_version(status='approved')
v.approved_by = 'dpo@company.de'
-v.approved_at = datetime.utcnow()
+v.approved_at = datetime.now(timezone.utc)
resp = _version_to_response(v)
assert resp.status == 'approved'
assert resp.approved_by == 'dpo@company.de'
@@ -254,7 +254,7 @@ class TestApprovalWorkflow:
# Step 2: Approve
mock_db.reset_mock()
_transition(mock_db, str(v.id), ['review'], 'approved', 'approved', 'dpo', 'Korrekt',
-extra_updates={'approved_by': 'dpo', 'approved_at': datetime.utcnow()})
+extra_updates={'approved_by': 'dpo', 'approved_at': datetime.now(timezone.utc)})
assert v.status == 'approved'
# Step 3: Publish


@@ -5,7 +5,7 @@ Tests for Legal Document extended routes (User Consents, Audit Log, Cookie Categ
import uuid
import os
import sys
-from datetime import datetime
+from datetime import datetime, timezone
import pytest
from fastapi import FastAPI
@@ -103,7 +103,7 @@ def _publish_version(version_id):
v = db.query(LegalDocumentVersionDB).filter(LegalDocumentVersionDB.id == vid).first()
v.status = "published"
v.approved_by = "admin"
-v.approved_at = datetime.utcnow()
+v.approved_at = datetime.now(timezone.utc)
db.commit()
db.refresh(v)
result = {"id": str(v.id), "status": v.status}


@@ -15,7 +15,7 @@ import pytest
import uuid
import os
import sys
-from datetime import datetime
+from datetime import datetime, timezone
from fastapi import FastAPI
from fastapi.testclient import TestClient
@@ -40,7 +40,7 @@ TENANT_ID = "default"
@event.listens_for(engine, "connect")
def _register_sqlite_functions(dbapi_conn, connection_record):
-dbapi_conn.create_function("NOW", 0, lambda: datetime.utcnow().isoformat())
+dbapi_conn.create_function("NOW", 0, lambda: datetime.now(timezone.utc).isoformat())
class _DictRow(dict):


@@ -186,7 +186,7 @@ class TestActivityToResponse:
act.next_review_at = kwargs.get("next_review_at", None)
act.created_by = kwargs.get("created_by", None)
act.dsfa_id = kwargs.get("dsfa_id", None)
-act.created_at = datetime.utcnow()
+act.created_at = datetime.now(timezone.utc)
act.updated_at = None
return act
@@ -330,7 +330,7 @@ class TestVVTConsolidationResponse:
act.next_review_at = kwargs.get("next_review_at", None)
act.created_by = kwargs.get("created_by", None)
act.dsfa_id = kwargs.get("dsfa_id", None)
-act.created_at = datetime.utcnow()
+act.created_at = datetime.now(timezone.utc)
act.updated_at = None
return act


@@ -10,7 +10,7 @@ Verifies that:
import pytest
import uuid
from unittest.mock import MagicMock, AsyncMock, patch
-from datetime import datetime
+from datetime import datetime, timezone
from fastapi import HTTPException
from fastapi.testclient import TestClient
@@ -144,8 +144,8 @@ def _make_activity(tenant_id, vvt_id="VVT-001", name="Test", **kwargs):
act.next_review_at = None
act.created_by = "system"
act.dsfa_id = None
-act.created_at = datetime.utcnow()
-act.updated_at = datetime.utcnow()
+act.created_at = datetime.now(timezone.utc)
+act.updated_at = datetime.now(timezone.utc)
return act