Squash of branch refactor/phase0-guardrails-and-models-split — 4 commits,
81 files, 173/173 pytest green, OpenAPI contract preserved (360 paths /
484 operations).
## Phase 0 — Architecture guardrails
Three defense-in-depth layers to keep the architecture rules enforced
regardless of who opens Claude Code in this repo:
1. .claude/settings.json PreToolUse hook on Write/Edit blocks any edit
that would push a file past the 500-line hard cap. It auto-loads in
every Claude session in this repo.
2. scripts/githooks/pre-commit (install via scripts/install-hooks.sh)
enforces the LOC cap locally, freezes migrations/ without
[migration-approved], and protects guardrail files without
[guardrail-change].
3. .gitea/workflows/ci.yaml gains loc-budget + guardrail-integrity +
sbom-scan (syft+grype) jobs, adds mypy --strict for the new Python
packages (compliance/{services,repositories,domain,schemas}), and
tsc --noEmit for admin-compliance + developer-portal.
Per-language conventions documented in AGENTS.python.md, AGENTS.go.md,
AGENTS.typescript.md at the repo root — layering, tooling, and explicit
"what you may NOT do" lists. Root CLAUDE.md is prepended with the six
non-negotiable rules. Each of the 10 services gets a README.md.
scripts/check-loc.sh enforces soft 300 / hard 500 and surfaces the
current baseline of 205 hard + 161 soft violations so Phases 1-4 can
drain it incrementally. CI gates only CHANGED files in PRs so the
legacy baseline does not block unrelated work.
## Deprecation sweep
47 files. Pydantic V1 regex= -> pattern= (2 sites), class Config ->
ConfigDict in source_policy_router.py (schemas.py intentionally skipped;
it is the Phase 1 Step 3 split target). datetime.utcnow() ->
datetime.now(timezone.utc) everywhere, including SQLAlchemy default=
callables. All DB columns already declare timezone=True, so this is a
latent-bug fix on the Python side, not a schema change.
DeprecationWarning count dropped from 158 to 35.
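The datetime swap is mechanical but worth spelling out, since the two calls differ only in tzinfo (illustrative sketch, not a line from the sweep):

```python
from datetime import datetime, timezone

# Old: datetime.utcnow() — naive, deprecated since Python 3.12.
# New: aware UTC timestamp; safe to compare against timezone=True columns.
now = datetime.now(timezone.utc)
assert now.tzinfo is not None

# In SQLAlchemy column defaults, the replacement must stay a callable:
#   default=lambda: datetime.now(timezone.utc)   # not default=datetime.utcnow
```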
## Phase 1 Step 1 — Contract test harness
tests/contracts/test_openapi_baseline.py diffs the live FastAPI /openapi.json
against tests/contracts/openapi.baseline.json on every test run. Fails on
removed paths, removed status codes, or new required request body fields.
Regenerate the baseline only via tests/contracts/regenerate_baseline.py
after a deliberate, consumer-coordinated contract change. This is the
safety harness for all subsequent refactor commits.
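The core of such a baseline diff is small. A sketch of the removed-paths check only (the real test in tests/contracts/test_openapi_baseline.py also covers removed status codes and new required fields; this helper name is illustrative):

```python
def removed_paths(baseline: dict, live: dict) -> list[str]:
    """Paths present in the baseline but missing from the live schema."""
    return sorted(set(baseline.get("paths", {})) - set(live.get("paths", {})))

baseline = {"paths": {"/dsr": {}, "/dsr/stats": {}, "/dsr/export": {}}}
live = {"paths": {"/dsr": {}, "/dsr/export": {}}}

# A non-empty result fails the contract test.
assert removed_paths(baseline, live) == ["/dsr/stats"]
```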
## Phase 1 Step 2 — models.py split (1466 -> 85 LOC shim)
compliance/db/models.py is decomposed into seven sibling aggregate modules
following the existing repo pattern (dsr_models.py, vvt_models.py, ...):
regulation_models.py (134) — Regulation, Requirement
control_models.py (279) — Control, Mapping, Evidence, Risk
ai_system_models.py (141) — AISystem, AuditExport
service_module_models.py (176) — ServiceModule, ModuleRegulation, ModuleRisk
audit_session_models.py (177) — AuditSession, AuditSignOff
isms_governance_models.py (323) — ISMSScope, Context, Policy, Objective, SoA
isms_audit_models.py (468) — Finding, CAPA, MgmtReview, InternalAudit,
AuditTrail, Readiness
models.py becomes an 85-line re-export shim in dependency order so
existing imports continue to work unchanged. Schema is byte-identical:
__tablename__, column definitions, relationship strings, back_populates,
cascade directives all preserved.
All new sibling files are under the 500-line hard cap; largest is
isms_audit_models.py at 468. No file in compliance/db/ now exceeds
the hard cap.
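The shim pattern itself is simple to demonstrate. A toy sketch with stand-in namespaces (the real shim is compliance/db/models.py importing from the seven sibling modules above; SimpleNamespace here only stands in for those modules):

```python
import types

# Stand-ins for two of the split modules.
regulation_models = types.SimpleNamespace(Regulation=type("Regulation", (), {}))
control_models = types.SimpleNamespace(Control=type("Control", (), {}))

# The shim re-exports in dependency order; old import sites resolve to
# the very same classes, so behavior and schema are unchanged.
models = types.SimpleNamespace(
    Regulation=regulation_models.Regulation,  # no dependencies
    Control=control_models.Control,           # references Regulation
)

assert models.Regulation is regulation_models.Regulation
assert models.Control is control_models.Control
```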
## Phase 1 Step 3 — infrastructure only
backend-compliance/compliance/{schemas,domain,repositories}/ packages
are created as landing zones with docstrings. compliance/domain/
exports DomainError / NotFoundError / ConflictError / ValidationError /
PermissionError — the base classes services will use to raise
domain-level errors instead of HTTPException.
PHASE1_RUNBOOK.md at backend-compliance/PHASE1_RUNBOOK.md documents
the nine-step execution plan for Phase 1: snapshot baseline,
characterization tests, split models.py (this commit), split schemas.py
(next), extract services, extract repositories, mypy --strict, coverage.
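A plausible shape for that error hierarchy, with the HTTP mapping the routers could apply (class names are from the commit; the bodies, status mapping, and `to_http_status` helper are assumptions):

```python
class DomainError(Exception):
    """Base class for domain-level failures raised by services."""

class NotFoundError(DomainError): ...
class ConflictError(DomainError): ...
class ValidationError(DomainError): ...
class PermissionError(DomainError): ...  # note: shadows the builtin

# One place in the HTTP layer translates domain errors to status codes,
# so services never import HTTPException.
_STATUS = {NotFoundError: 404, ConflictError: 409,
           ValidationError: 422, PermissionError: 403}

def to_http_status(exc: DomainError) -> int:
    return _STATUS.get(type(exc), 500)

assert to_http_status(NotFoundError("DSR not found")) == 404
```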
## Verification
backend-compliance/.venv-phase1: uv python install 3.12 + pip -r requirements.txt
PYTHONPATH=. pytest compliance/tests/ tests/contracts/
-> 173 passed, 0 failed, 35 warnings, OpenAPI 360/484 unchanged
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
1177 lines
38 KiB
Python
"""
|
|
DSR (Data Subject Request) Routes — Betroffenenanfragen nach DSGVO Art. 15-21.
|
|
|
|
Native Python/FastAPI Implementierung, ersetzt Go consent-service Proxy.
|
|
"""
|
|
|
|
import io
|
|
import csv
|
|
import uuid
|
|
from datetime import datetime, timedelta, timezone
|
|
from typing import Optional, List, Dict, Any
|
|
|
|
from fastapi import APIRouter, Depends, HTTPException, Query, Header
|
|
from fastapi.responses import StreamingResponse
|
|
from pydantic import BaseModel
|
|
from sqlalchemy.orm import Session
|
|
from sqlalchemy import text, func, and_, or_
|
|
|
|
from classroom_engine.database import get_db
|
|
from ..db.dsr_models import (
|
|
DSRRequestDB, DSRStatusHistoryDB, DSRCommunicationDB,
|
|
DSRTemplateDB, DSRTemplateVersionDB, DSRExceptionCheckDB,
|
|
)
|
|
|
|
router = APIRouter(prefix="/dsr", tags=["compliance-dsr"])
|
|
|
|
# Default tenant
DEFAULT_TENANT = "9282a473-5c95-4b3a-bf78-0ecc0ec71d3e"

# Art. 17(3) erasure exceptions
ART17_EXCEPTIONS = [
    {
        "check_code": "art17_3_a",
        "article": "17(3)(a)",
        "label": "Meinungs- und Informationsfreiheit",
        "description": "Ausuebung des Rechts auf freie Meinungsaeusserung und Information",
    },
    {
        "check_code": "art17_3_b",
        "article": "17(3)(b)",
        "label": "Rechtliche Verpflichtung",
        "description": "Erfuellung einer rechtlichen Verpflichtung (z.B. Aufbewahrungspflichten)",
    },
    {
        "check_code": "art17_3_c",
        "article": "17(3)(c)",
        "label": "Oeffentliches Interesse",
        "description": "Gruende des oeffentlichen Interesses im Bereich Gesundheit",
    },
    {
        "check_code": "art17_3_d",
        "article": "17(3)(d)",
        "label": "Archivzwecke",
        "description": "Archivzwecke, wissenschaftliche/historische Forschung, Statistik",
    },
    {
        "check_code": "art17_3_e",
        "article": "17(3)(e)",
        "label": "Rechtsansprueche",
        "description": "Geltendmachung, Ausuebung oder Verteidigung von Rechtsanspruechen",
    },
]

VALID_REQUEST_TYPES = ["access", "rectification", "erasure", "restriction", "portability", "objection"]
VALID_STATUSES = ["intake", "identity_verification", "processing", "completed", "rejected", "cancelled"]
VALID_PRIORITIES = ["low", "normal", "high", "critical"]
VALID_SOURCES = ["web_form", "email", "letter", "phone", "in_person", "other"]

# Deadline days per request type (GDPR Art. 12(3))
DEADLINE_DAYS = {
    "access": 30,
    "rectification": 14,
    "erasure": 14,
    "restriction": 14,
    "portability": 30,
    "objection": 30,
}

# =============================================================================
# Pydantic Schemas
# =============================================================================

class DSRCreate(BaseModel):
    request_type: str = "access"
    requester_name: str
    requester_email: str
    requester_phone: Optional[str] = None
    requester_address: Optional[str] = None
    requester_customer_id: Optional[str] = None
    source: str = "email"
    source_details: Optional[str] = None
    request_text: Optional[str] = None
    priority: Optional[str] = "normal"
    notes: Optional[str] = None


class DSRUpdate(BaseModel):
    priority: Optional[str] = None
    notes: Optional[str] = None
    internal_notes: Optional[str] = None
    assigned_to: Optional[str] = None
    request_text: Optional[str] = None
    affected_systems: Optional[List[str]] = None
    erasure_checklist: Optional[List[Dict[str, Any]]] = None
    rectification_details: Optional[Dict[str, Any]] = None
    objection_details: Optional[Dict[str, Any]] = None


class StatusChange(BaseModel):
    status: str
    comment: Optional[str] = None


class VerifyIdentity(BaseModel):
    method: str
    notes: Optional[str] = None
    document_ref: Optional[str] = None


class AssignRequest(BaseModel):
    assignee_id: str


class ExtendDeadline(BaseModel):
    reason: str
    days: Optional[int] = 60


class CompleteDSR(BaseModel):
    summary: Optional[str] = None
    result_data: Optional[Dict[str, Any]] = None


class RejectDSR(BaseModel):
    reason: str
    legal_basis: Optional[str] = None


class SendCommunication(BaseModel):
    communication_type: str = "outgoing"
    channel: str = "email"
    subject: Optional[str] = None
    content: str
    template_used: Optional[str] = None


class UpdateExceptionCheck(BaseModel):
    applies: bool
    notes: Optional[str] = None


class CreateTemplateVersion(BaseModel):
    version: str = "1.0"
    language: Optional[str] = "de"
    subject: str
    body_html: str
    body_text: Optional[str] = None

# =============================================================================
# Helpers
# =============================================================================

def _get_tenant(x_tenant_id: Optional[str] = Header(None, alias='X-Tenant-ID')) -> str:
    return x_tenant_id or DEFAULT_TENANT


def _generate_request_number(db: Session, tenant_id: str) -> str:
    """Generate the next request number: DSR-YYYY-NNNNNN."""
    year = datetime.now(timezone.utc).year
    try:
        result = db.execute(text("SELECT nextval('compliance_dsr_request_number_seq')"))
        seq = result.scalar()
    except Exception:
        # Fallback for non-PostgreSQL (e.g. SQLite tests): count existing + 1
        count = db.query(DSRRequestDB).count()
        seq = count + 1
    return f"DSR-{year}-{str(seq).zfill(6)}"


def _record_history(db: Session, dsr: DSRRequestDB, new_status: str, changed_by: str = "system", comment: Optional[str] = None):
    """Record a status change in the history table."""
    entry = DSRStatusHistoryDB(
        tenant_id=dsr.tenant_id,
        dsr_id=dsr.id,
        previous_status=dsr.status,
        new_status=new_status,
        changed_by=changed_by,
        comment=comment,
    )
    db.add(entry)


def _dsr_to_dict(dsr: DSRRequestDB) -> dict:
    """Convert a DSR DB record to an API response dict."""
    return {
        "id": str(dsr.id),
        "tenant_id": str(dsr.tenant_id),
        "request_number": dsr.request_number,
        "request_type": dsr.request_type,
        "status": dsr.status,
        "priority": dsr.priority,
        "requester_name": dsr.requester_name,
        "requester_email": dsr.requester_email,
        "requester_phone": dsr.requester_phone,
        "requester_address": dsr.requester_address,
        "requester_customer_id": dsr.requester_customer_id,
        "source": dsr.source,
        "source_details": dsr.source_details,
        "request_text": dsr.request_text,
        "notes": dsr.notes,
        "internal_notes": dsr.internal_notes,
        "received_at": dsr.received_at.isoformat() if dsr.received_at else None,
        "deadline_at": dsr.deadline_at.isoformat() if dsr.deadline_at else None,
        "extended_deadline_at": dsr.extended_deadline_at.isoformat() if dsr.extended_deadline_at else None,
        "extension_reason": dsr.extension_reason,
        "extension_approved_by": dsr.extension_approved_by,
        "extension_approved_at": dsr.extension_approved_at.isoformat() if dsr.extension_approved_at else None,
        "identity_verified": dsr.identity_verified,
        "verification_method": dsr.verification_method,
        "verified_at": dsr.verified_at.isoformat() if dsr.verified_at else None,
        "verified_by": dsr.verified_by,
        "verification_notes": dsr.verification_notes,
        "verification_document_ref": dsr.verification_document_ref,
        "assigned_to": dsr.assigned_to,
        "assigned_at": dsr.assigned_at.isoformat() if dsr.assigned_at else None,
        "assigned_by": dsr.assigned_by,
        "completed_at": dsr.completed_at.isoformat() if dsr.completed_at else None,
        "completion_notes": dsr.completion_notes,
        "rejection_reason": dsr.rejection_reason,
        "rejection_legal_basis": dsr.rejection_legal_basis,
        "erasure_checklist": dsr.erasure_checklist or [],
        "data_export": dsr.data_export or {},
        "rectification_details": dsr.rectification_details or {},
        "objection_details": dsr.objection_details or {},
        "affected_systems": dsr.affected_systems or [],
        "created_at": dsr.created_at.isoformat() if dsr.created_at else None,
        "updated_at": dsr.updated_at.isoformat() if dsr.updated_at else None,
        "created_by": dsr.created_by,
        "updated_by": dsr.updated_by,
    }


def _get_dsr_or_404(db: Session, dsr_id: str, tenant_id: str) -> DSRRequestDB:
    """Get a DSR by ID or raise 404."""
    try:
        uid = uuid.UUID(dsr_id)
    except ValueError:
        raise HTTPException(status_code=400, detail="Invalid DSR ID format")
    dsr = db.query(DSRRequestDB).filter(
        DSRRequestDB.id == uid,
        DSRRequestDB.tenant_id == uuid.UUID(tenant_id),
    ).first()
    if not dsr:
        raise HTTPException(status_code=404, detail="DSR not found")
    return dsr

# =============================================================================
# DSR CRUD Endpoints
# =============================================================================

@router.post("")
async def create_dsr(
    body: DSRCreate,
    tenant_id: str = Depends(_get_tenant),
    db: Session = Depends(get_db),
):
    """Create a new data subject request."""
    if body.request_type not in VALID_REQUEST_TYPES:
        raise HTTPException(status_code=400, detail=f"Invalid request_type. Must be one of: {VALID_REQUEST_TYPES}")
    if body.source not in VALID_SOURCES:
        raise HTTPException(status_code=400, detail=f"Invalid source. Must be one of: {VALID_SOURCES}")
    if body.priority and body.priority not in VALID_PRIORITIES:
        raise HTTPException(status_code=400, detail=f"Invalid priority. Must be one of: {VALID_PRIORITIES}")

    now = datetime.now(timezone.utc)
    deadline_days = DEADLINE_DAYS.get(body.request_type, 30)
    request_number = _generate_request_number(db, tenant_id)

    dsr = DSRRequestDB(
        tenant_id=uuid.UUID(tenant_id),
        request_number=request_number,
        request_type=body.request_type,
        status="intake",
        priority=body.priority or "normal",
        requester_name=body.requester_name,
        requester_email=body.requester_email,
        requester_phone=body.requester_phone,
        requester_address=body.requester_address,
        requester_customer_id=body.requester_customer_id,
        source=body.source,
        source_details=body.source_details,
        request_text=body.request_text,
        notes=body.notes,
        received_at=now,
        deadline_at=now + timedelta(days=deadline_days),
        created_at=now,
        updated_at=now,
    )
    db.add(dsr)
    db.flush()  # Ensure dsr.id is assigned before referencing it below

    # Initial history entry
    history = DSRStatusHistoryDB(
        tenant_id=uuid.UUID(tenant_id),
        dsr_id=dsr.id,
        previous_status=None,
        new_status="intake",
        changed_by="system",
        comment="DSR erstellt",
    )
    db.add(history)

    db.commit()
    db.refresh(dsr)
    return _dsr_to_dict(dsr)

@router.get("")
|
|
async def list_dsrs(
|
|
status: Optional[str] = Query(None),
|
|
request_type: Optional[str] = Query(None),
|
|
assigned_to: Optional[str] = Query(None),
|
|
priority: Optional[str] = Query(None),
|
|
overdue_only: bool = Query(False),
|
|
search: Optional[str] = Query(None),
|
|
from_date: Optional[str] = Query(None),
|
|
to_date: Optional[str] = Query(None),
|
|
limit: int = Query(20, ge=1, le=100),
|
|
offset: int = Query(0, ge=0),
|
|
tenant_id: str = Depends(_get_tenant),
|
|
db: Session = Depends(get_db),
|
|
):
|
|
"""Liste aller DSRs mit Filtern."""
|
|
query = db.query(DSRRequestDB).filter(
|
|
DSRRequestDB.tenant_id == uuid.UUID(tenant_id),
|
|
)
|
|
|
|
if status:
|
|
query = query.filter(DSRRequestDB.status == status)
|
|
if request_type:
|
|
query = query.filter(DSRRequestDB.request_type == request_type)
|
|
if assigned_to:
|
|
query = query.filter(DSRRequestDB.assigned_to == assigned_to)
|
|
if priority:
|
|
query = query.filter(DSRRequestDB.priority == priority)
|
|
if overdue_only:
|
|
query = query.filter(
|
|
DSRRequestDB.deadline_at < datetime.now(timezone.utc),
|
|
DSRRequestDB.status.notin_(["completed", "rejected", "cancelled"]),
|
|
)
|
|
if search:
|
|
search_term = f"%{search.lower()}%"
|
|
query = query.filter(
|
|
or_(
|
|
func.lower(func.coalesce(DSRRequestDB.requester_name, '')).like(search_term),
|
|
func.lower(func.coalesce(DSRRequestDB.requester_email, '')).like(search_term),
|
|
func.lower(func.coalesce(DSRRequestDB.request_number, '')).like(search_term),
|
|
func.lower(func.coalesce(DSRRequestDB.request_text, '')).like(search_term),
|
|
)
|
|
)
|
|
if from_date:
|
|
query = query.filter(DSRRequestDB.received_at >= from_date)
|
|
if to_date:
|
|
query = query.filter(DSRRequestDB.received_at <= to_date)
|
|
|
|
total = query.count()
|
|
dsrs = query.order_by(DSRRequestDB.created_at.desc()).offset(offset).limit(limit).all()
|
|
|
|
return {
|
|
"requests": [_dsr_to_dict(d) for d in dsrs],
|
|
"total": total,
|
|
"limit": limit,
|
|
"offset": offset,
|
|
}
|
|
|
|
|
|
@router.get("/stats")
|
|
async def get_dsr_stats(
|
|
tenant_id: str = Depends(_get_tenant),
|
|
db: Session = Depends(get_db),
|
|
):
|
|
"""Dashboard-Statistiken fuer DSRs."""
|
|
tid = uuid.UUID(tenant_id)
|
|
base = db.query(DSRRequestDB).filter(DSRRequestDB.tenant_id == tid)
|
|
|
|
total = base.count()
|
|
|
|
# By status
|
|
by_status = {}
|
|
for s in VALID_STATUSES:
|
|
by_status[s] = base.filter(DSRRequestDB.status == s).count()
|
|
|
|
# By type
|
|
by_type = {}
|
|
for t in VALID_REQUEST_TYPES:
|
|
by_type[t] = base.filter(DSRRequestDB.request_type == t).count()
|
|
|
|
# Overdue
|
|
now = datetime.now(timezone.utc)
|
|
overdue = base.filter(
|
|
DSRRequestDB.deadline_at < now,
|
|
DSRRequestDB.status.notin_(["completed", "rejected", "cancelled"]),
|
|
).count()
|
|
|
|
# Due this week
|
|
week_from_now = now + timedelta(days=7)
|
|
due_this_week = base.filter(
|
|
DSRRequestDB.deadline_at >= now,
|
|
DSRRequestDB.deadline_at <= week_from_now,
|
|
DSRRequestDB.status.notin_(["completed", "rejected", "cancelled"]),
|
|
).count()
|
|
|
|
# Completed this month
|
|
month_start = now.replace(day=1, hour=0, minute=0, second=0, microsecond=0)
|
|
completed_this_month = base.filter(
|
|
DSRRequestDB.status == "completed",
|
|
DSRRequestDB.completed_at >= month_start,
|
|
).count()
|
|
|
|
# Average processing days (completed DSRs)
|
|
completed = base.filter(DSRRequestDB.status == "completed", DSRRequestDB.completed_at.isnot(None)).all()
|
|
if completed:
|
|
total_days = sum(
|
|
(d.completed_at - d.received_at).days for d in completed if d.completed_at and d.received_at
|
|
)
|
|
avg_days = total_days / len(completed)
|
|
else:
|
|
avg_days = 0
|
|
|
|
return {
|
|
"total": total,
|
|
"by_status": by_status,
|
|
"by_type": by_type,
|
|
"overdue": overdue,
|
|
"due_this_week": due_this_week,
|
|
"average_processing_days": round(avg_days, 1),
|
|
"completed_this_month": completed_this_month,
|
|
}
|
|
|
|
|
|
# =============================================================================
# Export
# =============================================================================

@router.get("/export")
async def export_dsrs(
    format: str = Query("csv", pattern="^(csv|json)$"),
    tenant_id: str = Depends(_get_tenant),
    db: Session = Depends(get_db),
):
    """Export all DSRs as CSV or JSON."""
    tid = uuid.UUID(tenant_id)
    dsrs = db.query(DSRRequestDB).filter(
        DSRRequestDB.tenant_id == tid,
    ).order_by(DSRRequestDB.created_at.desc()).all()

    if format == "json":
        return {
            "exported_at": datetime.now(timezone.utc).isoformat(),
            "total": len(dsrs),
            "requests": [_dsr_to_dict(d) for d in dsrs],
        }

    # CSV export (semicolon-separated, matching Go format + extended fields)
    output = io.StringIO()
    writer = csv.writer(output, delimiter=';', quoting=csv.QUOTE_MINIMAL)
    writer.writerow([
        "ID", "Referenznummer", "Typ", "Name", "E-Mail", "Status",
        "Prioritaet", "Eingegangen", "Frist", "Abgeschlossen", "Quelle", "Zugewiesen",
    ])

    for dsr in dsrs:
        writer.writerow([
            str(dsr.id),
            dsr.request_number or "",
            dsr.request_type or "",
            dsr.requester_name or "",
            dsr.requester_email or "",
            dsr.status or "",
            dsr.priority or "",
            dsr.received_at.strftime("%Y-%m-%d") if dsr.received_at else "",
            dsr.deadline_at.strftime("%Y-%m-%d") if dsr.deadline_at else "",
            dsr.completed_at.strftime("%Y-%m-%d") if dsr.completed_at else "",
            dsr.source or "",
            dsr.assigned_to or "",
        ])

    output.seek(0)
    return StreamingResponse(
        output,
        media_type="text/csv; charset=utf-8",
        headers={"Content-Disposition": "attachment; filename=dsr_export.csv"},
    )

# =============================================================================
# Deadline Processing (MUST be before /{dsr_id} to avoid path conflicts)
# =============================================================================

@router.post("/deadlines/process")
async def process_deadlines(
    tenant_id: str = Depends(_get_tenant),
    db: Session = Depends(get_db),
):
    """Process deadlines and flag overdue DSRs."""
    now = datetime.now(timezone.utc)
    tid = uuid.UUID(tenant_id)

    overdue = db.query(DSRRequestDB).filter(
        DSRRequestDB.tenant_id == tid,
        DSRRequestDB.status.notin_(["completed", "rejected", "cancelled"]),
        or_(
            and_(DSRRequestDB.extended_deadline_at.isnot(None), DSRRequestDB.extended_deadline_at < now),
            and_(DSRRequestDB.extended_deadline_at.is_(None), DSRRequestDB.deadline_at < now),
        ),
    ).all()

    processed = []
    for dsr in overdue:
        processed.append({
            "id": str(dsr.id),
            "request_number": dsr.request_number,
            "status": dsr.status,
            "deadline_at": dsr.deadline_at.isoformat() if dsr.deadline_at else None,
            "extended_deadline_at": dsr.extended_deadline_at.isoformat() if dsr.extended_deadline_at else None,
            "days_overdue": (now - (dsr.extended_deadline_at or dsr.deadline_at)).days,
        })

    return {
        "processed": len(processed),
        "overdue_requests": processed,
    }

# =============================================================================
# DSR Templates (MUST be before /{dsr_id} to avoid path conflicts)
# =============================================================================

@router.get("/templates")
async def get_templates(
    tenant_id: str = Depends(_get_tenant),
    db: Session = Depends(get_db),
):
    """Return all DSR templates."""
    templates = db.query(DSRTemplateDB).filter(
        DSRTemplateDB.tenant_id == uuid.UUID(tenant_id),
    ).order_by(DSRTemplateDB.template_type).all()

    return [
        {
            "id": str(t.id),
            "name": t.name,
            "template_type": t.template_type,
            "request_type": t.request_type,
            "language": t.language,
            "is_active": t.is_active,
            "created_at": t.created_at.isoformat() if t.created_at else None,
            "updated_at": t.updated_at.isoformat() if t.updated_at else None,
        }
        for t in templates
    ]


@router.get("/templates/published")
async def get_published_templates(
    request_type: Optional[str] = Query(None),
    language: str = Query("de"),
    tenant_id: str = Depends(_get_tenant),
    db: Session = Depends(get_db),
):
    """Return published templates."""
    query = db.query(DSRTemplateDB).filter(
        DSRTemplateDB.tenant_id == uuid.UUID(tenant_id),
        DSRTemplateDB.is_active,
        DSRTemplateDB.language == language,
    )
    if request_type:
        query = query.filter(
            or_(
                DSRTemplateDB.request_type == request_type,
                DSRTemplateDB.request_type.is_(None),
            )
        )

    templates = query.all()
    result = []
    for t in templates:
        latest = db.query(DSRTemplateVersionDB).filter(
            DSRTemplateVersionDB.template_id == t.id,
            DSRTemplateVersionDB.status == "published",
        ).order_by(DSRTemplateVersionDB.created_at.desc()).first()

        result.append({
            "id": str(t.id),
            "name": t.name,
            "template_type": t.template_type,
            "request_type": t.request_type,
            "language": t.language,
            "latest_version": {
                "id": str(latest.id),
                "version": latest.version,
                "subject": latest.subject,
                "body_html": latest.body_html,
                "body_text": latest.body_text,
            } if latest else None,
        })

    return result

@router.get("/templates/{template_id}/versions")
|
|
async def get_template_versions(
|
|
template_id: str,
|
|
tenant_id: str = Depends(_get_tenant),
|
|
db: Session = Depends(get_db),
|
|
):
|
|
"""Gibt alle Versionen einer Vorlage zurueck."""
|
|
try:
|
|
tid = uuid.UUID(template_id)
|
|
except ValueError:
|
|
raise HTTPException(status_code=400, detail="Invalid template ID")
|
|
|
|
template = db.query(DSRTemplateDB).filter(
|
|
DSRTemplateDB.id == tid,
|
|
DSRTemplateDB.tenant_id == uuid.UUID(tenant_id),
|
|
).first()
|
|
if not template:
|
|
raise HTTPException(status_code=404, detail="Template not found")
|
|
|
|
versions = db.query(DSRTemplateVersionDB).filter(
|
|
DSRTemplateVersionDB.template_id == tid,
|
|
).order_by(DSRTemplateVersionDB.created_at.desc()).all()
|
|
|
|
return [
|
|
{
|
|
"id": str(v.id),
|
|
"template_id": str(v.template_id),
|
|
"version": v.version,
|
|
"subject": v.subject,
|
|
"body_html": v.body_html,
|
|
"body_text": v.body_text,
|
|
"status": v.status,
|
|
"published_at": v.published_at.isoformat() if v.published_at else None,
|
|
"published_by": v.published_by,
|
|
"created_at": v.created_at.isoformat() if v.created_at else None,
|
|
"created_by": v.created_by,
|
|
}
|
|
for v in versions
|
|
]
|
|
|
|
|
|
@router.post("/templates/{template_id}/versions")
|
|
async def create_template_version(
|
|
template_id: str,
|
|
body: CreateTemplateVersion,
|
|
tenant_id: str = Depends(_get_tenant),
|
|
db: Session = Depends(get_db),
|
|
):
|
|
"""Erstellt eine neue Version einer Vorlage."""
|
|
try:
|
|
tid = uuid.UUID(template_id)
|
|
except ValueError:
|
|
raise HTTPException(status_code=400, detail="Invalid template ID")
|
|
|
|
template = db.query(DSRTemplateDB).filter(
|
|
DSRTemplateDB.id == tid,
|
|
DSRTemplateDB.tenant_id == uuid.UUID(tenant_id),
|
|
).first()
|
|
if not template:
|
|
raise HTTPException(status_code=404, detail="Template not found")
|
|
|
|
version = DSRTemplateVersionDB(
|
|
template_id=tid,
|
|
version=body.version,
|
|
subject=body.subject,
|
|
body_html=body.body_html,
|
|
body_text=body.body_text,
|
|
status="draft",
|
|
)
|
|
db.add(version)
|
|
db.commit()
|
|
db.refresh(version)
|
|
|
|
return {
|
|
"id": str(version.id),
|
|
"template_id": str(version.template_id),
|
|
"version": version.version,
|
|
"subject": version.subject,
|
|
"body_html": version.body_html,
|
|
"body_text": version.body_text,
|
|
"status": version.status,
|
|
"created_at": version.created_at.isoformat() if version.created_at else None,
|
|
}
|
|
|
|
|
|
@router.put("/template-versions/{version_id}/publish")
|
|
async def publish_template_version(
|
|
version_id: str,
|
|
tenant_id: str = Depends(_get_tenant),
|
|
db: Session = Depends(get_db),
|
|
):
|
|
"""Veroeffentlicht eine Vorlagen-Version."""
|
|
try:
|
|
vid = uuid.UUID(version_id)
|
|
except ValueError:
|
|
raise HTTPException(status_code=400, detail="Invalid version ID")
|
|
|
|
version = db.query(DSRTemplateVersionDB).filter(
|
|
DSRTemplateVersionDB.id == vid,
|
|
).first()
|
|
if not version:
|
|
raise HTTPException(status_code=404, detail="Version not found")
|
|
|
|
now = datetime.now(timezone.utc)
|
|
version.status = "published"
|
|
version.published_at = now
|
|
version.published_by = "admin"
|
|
db.commit()
|
|
db.refresh(version)
|
|
|
|
return {
|
|
"id": str(version.id),
|
|
"template_id": str(version.template_id),
|
|
"version": version.version,
|
|
"status": version.status,
|
|
"published_at": version.published_at.isoformat(),
|
|
"published_by": version.published_by,
|
|
}
|
|
|
|
|
|
# =============================================================================
# Single DSR Endpoints (parameterized — MUST come after static paths)
# =============================================================================

@router.get("/{dsr_id}")
async def get_dsr(
    dsr_id: str,
    tenant_id: str = Depends(_get_tenant),
    db: Session = Depends(get_db),
):
    """Return a single data subject request."""
    dsr = _get_dsr_or_404(db, dsr_id, tenant_id)
    return _dsr_to_dict(dsr)


@router.put("/{dsr_id}")
async def update_dsr(
    dsr_id: str,
    body: DSRUpdate,
    tenant_id: str = Depends(_get_tenant),
    db: Session = Depends(get_db),
):
    """Update a data subject request."""
    dsr = _get_dsr_or_404(db, dsr_id, tenant_id)

    if body.priority is not None:
        if body.priority not in VALID_PRIORITIES:
            raise HTTPException(status_code=400, detail=f"Invalid priority: {body.priority}")
        dsr.priority = body.priority
    if body.notes is not None:
        dsr.notes = body.notes
    if body.internal_notes is not None:
        dsr.internal_notes = body.internal_notes
    if body.assigned_to is not None:
        dsr.assigned_to = body.assigned_to
        dsr.assigned_at = datetime.now(timezone.utc)
    if body.request_text is not None:
        dsr.request_text = body.request_text
    if body.affected_systems is not None:
        dsr.affected_systems = body.affected_systems
    if body.erasure_checklist is not None:
        dsr.erasure_checklist = body.erasure_checklist
    if body.rectification_details is not None:
        dsr.rectification_details = body.rectification_details
    if body.objection_details is not None:
        dsr.objection_details = body.objection_details

    dsr.updated_at = datetime.now(timezone.utc)
    db.commit()
    db.refresh(dsr)
    return _dsr_to_dict(dsr)


@router.delete("/{dsr_id}")
async def delete_dsr(
    dsr_id: str,
    tenant_id: str = Depends(_get_tenant),
    db: Session = Depends(get_db),
):
    """Cancel a DSR (soft delete → status cancelled)."""
    dsr = _get_dsr_or_404(db, dsr_id, tenant_id)
    if dsr.status in ("completed", "cancelled"):
        raise HTTPException(status_code=400, detail="DSR already completed or cancelled")

    _record_history(db, dsr, "cancelled", comment="DSR storniert")
    dsr.status = "cancelled"
    dsr.updated_at = datetime.now(timezone.utc)
    db.commit()
    return {"success": True, "message": "DSR cancelled"}

# =============================================================================
|
|
# Workflow Actions
|
|
# =============================================================================
|
|
|
|
@router.post("/{dsr_id}/status")
|
|
async def change_status(
|
|
dsr_id: str,
|
|
body: StatusChange,
|
|
tenant_id: str = Depends(_get_tenant),
|
|
db: Session = Depends(get_db),
|
|
):
|
|
"""Aendert den Status einer DSR."""
|
|
if body.status not in VALID_STATUSES:
|
|
raise HTTPException(status_code=400, detail=f"Invalid status: {body.status}")
|
|
|
|
dsr = _get_dsr_or_404(db, dsr_id, tenant_id)
|
|
_record_history(db, dsr, body.status, comment=body.comment)
|
|
dsr.status = body.status
|
|
dsr.updated_at = datetime.now(timezone.utc)
|
|
db.commit()
|
|
db.refresh(dsr)
|
|
return _dsr_to_dict(dsr)
|
|
|
|
|
|
@router.post("/{dsr_id}/verify-identity")
async def verify_identity(
    dsr_id: str,
    body: VerifyIdentity,
    tenant_id: str = Depends(_get_tenant),
    db: Session = Depends(get_db),
):
    """Verifies the identity of the requester."""
    dsr = _get_dsr_or_404(db, dsr_id, tenant_id)
    now = datetime.now(timezone.utc)

    dsr.identity_verified = True
    dsr.verification_method = body.method
    dsr.verified_at = now
    dsr.verified_by = "admin"
    dsr.verification_notes = body.notes
    dsr.verification_document_ref = body.document_ref

    # Auto-advance to processing if in identity_verification
    if dsr.status == "identity_verification":
        _record_history(db, dsr, "processing", comment="Identitaet verifiziert")
        dsr.status = "processing"
    elif dsr.status == "intake":
        _record_history(db, dsr, "identity_verification", comment="Identitaet verifiziert")
        dsr.status = "identity_verification"

    dsr.updated_at = now
    db.commit()
    db.refresh(dsr)
    return _dsr_to_dict(dsr)


@router.post("/{dsr_id}/assign")
async def assign_dsr(
    dsr_id: str,
    body: AssignRequest,
    tenant_id: str = Depends(_get_tenant),
    db: Session = Depends(get_db),
):
    """Assigns a DSR to a handler."""
    dsr = _get_dsr_or_404(db, dsr_id, tenant_id)
    now = datetime.now(timezone.utc)
    dsr.assigned_to = body.assignee_id
    dsr.assigned_at = now
    dsr.assigned_by = "admin"
    dsr.updated_at = now
    db.commit()
    db.refresh(dsr)
    return _dsr_to_dict(dsr)


@router.post("/{dsr_id}/extend")
async def extend_deadline(
    dsr_id: str,
    body: ExtendDeadline,
    tenant_id: str = Depends(_get_tenant),
    db: Session = Depends(get_db),
):
    """Extends the processing deadline (Art. 12(3) GDPR)."""
    dsr = _get_dsr_or_404(db, dsr_id, tenant_id)
    if dsr.status in ("completed", "rejected", "cancelled"):
        raise HTTPException(status_code=400, detail="Cannot extend deadline for closed DSR")

    now = datetime.now(timezone.utc)
    current_deadline = dsr.extended_deadline_at or dsr.deadline_at
    new_deadline = current_deadline + timedelta(days=body.days or 60)

    dsr.extended_deadline_at = new_deadline
    dsr.extension_reason = body.reason
    dsr.extension_approved_by = "admin"
    dsr.extension_approved_at = now
    dsr.updated_at = now

    _record_history(db, dsr, dsr.status, comment=f"Frist verlaengert: {body.reason}")
    db.commit()
    db.refresh(dsr)
    return _dsr_to_dict(dsr)


@router.post("/{dsr_id}/complete")
async def complete_dsr(
    dsr_id: str,
    body: CompleteDSR,
    tenant_id: str = Depends(_get_tenant),
    db: Session = Depends(get_db),
):
    """Completes a DSR successfully."""
    dsr = _get_dsr_or_404(db, dsr_id, tenant_id)
    if dsr.status in ("completed", "cancelled"):
        raise HTTPException(status_code=400, detail="DSR already completed or cancelled")

    now = datetime.now(timezone.utc)
    _record_history(db, dsr, "completed", comment=body.summary)
    dsr.status = "completed"
    dsr.completed_at = now
    dsr.completion_notes = body.summary
    if body.result_data:
        dsr.data_export = body.result_data
    dsr.updated_at = now
    db.commit()
    db.refresh(dsr)
    return _dsr_to_dict(dsr)


@router.post("/{dsr_id}/reject")
async def reject_dsr(
    dsr_id: str,
    body: RejectDSR,
    tenant_id: str = Depends(_get_tenant),
    db: Session = Depends(get_db),
):
    """Rejects a DSR with a legal basis."""
    dsr = _get_dsr_or_404(db, dsr_id, tenant_id)
    if dsr.status in ("completed", "rejected", "cancelled"):
        raise HTTPException(status_code=400, detail="DSR already closed")

    now = datetime.now(timezone.utc)
    _record_history(db, dsr, "rejected", comment=f"{body.reason} ({body.legal_basis})")
    dsr.status = "rejected"
    dsr.rejection_reason = body.reason
    dsr.rejection_legal_basis = body.legal_basis
    dsr.completed_at = now
    dsr.updated_at = now
    db.commit()
    db.refresh(dsr)
    return _dsr_to_dict(dsr)


# =============================================================================
# History & Communications
# =============================================================================

@router.get("/{dsr_id}/history")
async def get_history(
    dsr_id: str,
    tenant_id: str = Depends(_get_tenant),
    db: Session = Depends(get_db),
):
    """Returns the status history."""
    dsr = _get_dsr_or_404(db, dsr_id, tenant_id)
    entries = db.query(DSRStatusHistoryDB).filter(
        DSRStatusHistoryDB.dsr_id == dsr.id,
    ).order_by(DSRStatusHistoryDB.created_at.desc()).all()

    return [
        {
            "id": str(e.id),
            "dsr_id": str(e.dsr_id),
            "previous_status": e.previous_status,
            "new_status": e.new_status,
            "changed_by": e.changed_by,
            "comment": e.comment,
            "created_at": e.created_at.isoformat() if e.created_at else None,
        }
        for e in entries
    ]


@router.get("/{dsr_id}/communications")
async def get_communications(
    dsr_id: str,
    tenant_id: str = Depends(_get_tenant),
    db: Session = Depends(get_db),
):
    """Returns the communication history."""
    dsr = _get_dsr_or_404(db, dsr_id, tenant_id)
    comms = db.query(DSRCommunicationDB).filter(
        DSRCommunicationDB.dsr_id == dsr.id,
    ).order_by(DSRCommunicationDB.created_at.desc()).all()

    return [
        {
            "id": str(c.id),
            "dsr_id": str(c.dsr_id),
            "communication_type": c.communication_type,
            "channel": c.channel,
            "subject": c.subject,
            "content": c.content,
            "template_used": c.template_used,
            "attachments": c.attachments or [],
            "sent_at": c.sent_at.isoformat() if c.sent_at else None,
            "sent_by": c.sent_by,
            "received_at": c.received_at.isoformat() if c.received_at else None,
            "created_at": c.created_at.isoformat() if c.created_at else None,
            "created_by": c.created_by,
        }
        for c in comms
    ]


@router.post("/{dsr_id}/communicate")
async def send_communication(
    dsr_id: str,
    body: SendCommunication,
    tenant_id: str = Depends(_get_tenant),
    db: Session = Depends(get_db),
):
    """Sends a communication."""
    dsr = _get_dsr_or_404(db, dsr_id, tenant_id)
    now = datetime.now(timezone.utc)

    comm = DSRCommunicationDB(
        tenant_id=uuid.UUID(tenant_id),
        dsr_id=dsr.id,
        communication_type=body.communication_type,
        channel=body.channel,
        subject=body.subject,
        content=body.content,
        template_used=body.template_used,
        sent_at=now if body.communication_type == "outgoing" else None,
        sent_by="admin" if body.communication_type == "outgoing" else None,
        received_at=now if body.communication_type == "incoming" else None,
        created_at=now,
    )
    db.add(comm)
    db.commit()
    db.refresh(comm)

    return {
        "id": str(comm.id),
        "dsr_id": str(comm.dsr_id),
        "communication_type": comm.communication_type,
        "channel": comm.channel,
        "subject": comm.subject,
        "content": comm.content,
        "sent_at": comm.sent_at.isoformat() if comm.sent_at else None,
        "created_at": comm.created_at.isoformat() if comm.created_at else None,
    }


# =============================================================================
# Exception Checks (Art. 17)
# =============================================================================

@router.get("/{dsr_id}/exception-checks")
async def get_exception_checks(
    dsr_id: str,
    tenant_id: str = Depends(_get_tenant),
    db: Session = Depends(get_db),
):
    """Returns the Art. 17(3) exception checks."""
    dsr = _get_dsr_or_404(db, dsr_id, tenant_id)
    checks = db.query(DSRExceptionCheckDB).filter(
        DSRExceptionCheckDB.dsr_id == dsr.id,
    ).order_by(DSRExceptionCheckDB.check_code).all()

    return [
        {
            "id": str(c.id),
            "dsr_id": str(c.dsr_id),
            "check_code": c.check_code,
            "article": c.article,
            "label": c.label,
            "description": c.description,
            "applies": c.applies,
            "notes": c.notes,
            "checked_by": c.checked_by,
            "checked_at": c.checked_at.isoformat() if c.checked_at else None,
        }
        for c in checks
    ]


@router.post("/{dsr_id}/exception-checks/init")
async def init_exception_checks(
    dsr_id: str,
    tenant_id: str = Depends(_get_tenant),
    db: Session = Depends(get_db),
):
    """Initializes the Art. 17(3) exception checks for an erasure request."""
    dsr = _get_dsr_or_404(db, dsr_id, tenant_id)
    if dsr.request_type != "erasure":
        raise HTTPException(status_code=400, detail="Exception checks only for erasure requests")

    # Check if already initialized
    existing = db.query(DSRExceptionCheckDB).filter(DSRExceptionCheckDB.dsr_id == dsr.id).count()
    if existing > 0:
        raise HTTPException(status_code=400, detail="Exception checks already initialized")

    checks = []
    for exc in ART17_EXCEPTIONS:
        check = DSRExceptionCheckDB(
            tenant_id=uuid.UUID(tenant_id),
            dsr_id=dsr.id,
            check_code=exc["check_code"],
            article=exc["article"],
            label=exc["label"],
            description=exc["description"],
        )
        db.add(check)
        checks.append(check)

    db.commit()
    return [
        {
            "id": str(c.id),
            "dsr_id": str(c.dsr_id),
            "check_code": c.check_code,
            "article": c.article,
            "label": c.label,
            "description": c.description,
            "applies": c.applies,
            "notes": c.notes,
        }
        for c in checks
    ]


@router.put("/{dsr_id}/exception-checks/{check_id}")
async def update_exception_check(
    dsr_id: str,
    check_id: str,
    body: UpdateExceptionCheck,
    tenant_id: str = Depends(_get_tenant),
    db: Session = Depends(get_db),
):
    """Updates a single exception check."""
    dsr = _get_dsr_or_404(db, dsr_id, tenant_id)
    try:
        cid = uuid.UUID(check_id)
    except ValueError:
        raise HTTPException(status_code=400, detail="Invalid check ID") from None

    check = db.query(DSRExceptionCheckDB).filter(
        DSRExceptionCheckDB.id == cid,
        DSRExceptionCheckDB.dsr_id == dsr.id,
    ).first()
    if not check:
        raise HTTPException(status_code=404, detail="Exception check not found")

    check.applies = body.applies
    check.notes = body.notes
    check.checked_by = "admin"
    check.checked_at = datetime.now(timezone.utc)
    db.commit()
    db.refresh(check)

    return {
        "id": str(check.id),
        "dsr_id": str(check.dsr_id),
        "check_code": check.check_code,
        "article": check.article,
        "label": check.label,
        "description": check.description,
        "applies": check.applies,
        "notes": check.notes,
        "checked_by": check.checked_by,
        "checked_at": check.checked_at.isoformat() if check.checked_at else None,
    }