feat(sdk): multi-tenancy, versioning, change requests, document generation (phases 1-6)
All checks were successful
CI / go-lint (push) Has been skipped
CI / python-lint (push) Has been skipped
CI / nodejs-lint (push) Has been skipped
CI / test-go-ai-compliance (push) Successful in 32s
CI / test-python-backend-compliance (push) Successful in 30s
CI / test-python-document-crawler (push) Successful in 21s
CI / test-python-dsms-gateway (push) Successful in 18s
Six-phase implementation for a cloud-ready, multi-tenant compliance SDK:
Phase 1: multi-tenancy fix
- Shared tenant_utils.py dependency (UUID validation, no more "default")
- VVT tenant_id column + tenant-scoped queries
- DSFA/vendor DEFAULT_TENANT_ID migrated from "default" to a UUID
- Migration 035
Phase 2: master-data extension
- Company profile extended with JSONB fields (processing_systems, ai_systems, technical_contacts)
- Regulatory flags (NIS2, AI Act, ISO 27001)
- GET /template-context endpoint
- Migration 036
Phase 3: document versioning
- 5 version tables (DSFA, VVT, TOM, Löschfristen, obligations)
- Shared versioning_utils.py helper
- /{id}/versions endpoints on all 5 document types
- Migration 037
Phase 4: change-request system
- Central CR inbox with CRUD + accept/reject/edit workflow
- Rule-based CR engine (VVT DPIA → DSFA CR, data categories → Löschfristen CR)
- Audit trail
- Migration 038
Phase 5: document generation
- 5 template generators (DSFA, VVT, TOM, Löschfristen, obligations)
- Preview + apply endpoints (create CRs, no direct document writes)
Phase 6: frontend integration
- Change-request inbox page with stats, filters, and modals
- VersionHistory timeline component
- SDKSidebar CR badge (60s polling)
- Company profile: 2 new wizard steps + a "Generate documents" CTA
Docs: 5 new MkDocs pages, CLAUDE.md updated
Tests: 97 new tests (all passing)
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@@ -269,18 +269,55 @@ POST/GET /api/v1/compliance/evidence
POST/GET /api/v1/dsr/requests
POST/GET /api/v1/gdpr/exports
POST/GET /api/v1/consent/admin

# Master data, versioning & change requests (phases 1-6, 2026-03-07)
GET/POST/DELETE /api/compliance/company-profile
GET /api/compliance/company-profile/template-context
GET /api/compliance/change-requests
GET /api/compliance/change-requests/stats
POST /api/compliance/change-requests/{id}/accept
POST /api/compliance/change-requests/{id}/reject
POST /api/compliance/change-requests/{id}/edit
GET /api/compliance/generation/preview/{doc_type}
POST /api/compliance/generation/apply/{doc_type}
GET /api/compliance/{doc}/{id}/versions
```
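The generation endpoints above are documented as producing change requests rather than writing documents directly. A minimal Python sketch of that contract (the function name, context keys, and rendered draft are illustrative assumptions, not the actual `generation_routes.py` code):

```python
def apply_generation(doc_type: str, template_context: dict) -> list[dict]:
    """Sketch of the apply endpoint's contract: generation never writes a
    document directly; it emits pending change requests for human review."""
    # Render a draft from the master-data context (real code uses templates).
    rendered = f"{doc_type.upper()} draft for {template_context['company_name']}"
    return [{
        "target_document_type": doc_type,
        "status": "pending",                 # lands in the CR inbox, not in the document
        "proposal_title": f"Generierter {doc_type.upper()}-Entwurf",
        "proposal_body": rendered,
    }]
```

Accepting such a CR in the inbox is then the only path by which generated content reaches a document.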

### Multi-Tenancy

- Shared dependency: `compliance/api/tenant_utils.py` (`get_tenant_id()`)
- UUID format; the literal `"default"` tenant is gone
- Precedence: `X-Tenant-ID` header > `tenant_id` query parameter > ENV fallback
|
### Migrations (035-038)

| No. | File | Description |
|-----|------|-------------|
| 035 | `migrations/035_vvt_tenant_isolation.sql` | VVT tenant_id + DSFA/vendor default → UUID |
| 036 | `migrations/036_company_profile_extend.sql` | Master-data JSONB + regulatory flags |
| 037 | `migrations/037_document_versions.sql` | 5 version tables + current_version |
| 038 | `migrations/038_change_requests.sql` | Change requests + audit log |
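Migration 037's version tables pair each document with sequential, immutable snapshots. A rough in-memory sketch of the snapshot logic (the list stands in for one of the five `*_versions` tables; the field names are assumptions):

```python
import json
from datetime import datetime, timezone

def snapshot_version(versions: list[dict], doc_id: str,
                     payload: dict, created_by: str) -> int:
    """Append an immutable snapshot of a document and return its version number."""
    # Next sequential number per document, starting at 1.
    next_no = max((v["version_no"] for v in versions
                   if v["doc_id"] == doc_id), default=0) + 1
    versions.append({
        "doc_id": doc_id,
        "version_no": next_no,
        "payload": json.dumps(payload),  # frozen copy, detached from the live row
        "created_at": datetime.now(timezone.utc).isoformat(),
        "created_by": created_by,
    })
    return next_no
```

The `/{id}/versions` endpoints would then simply list these rows, newest first.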

### New backend modules

| File | Description |
|------|--------------|
| `compliance/api/tenant_utils.py` | Shared tenant-ID dependency |
| `compliance/api/versioning_utils.py` | Shared versioning helper |
| `compliance/api/change_request_routes.py` | CR CRUD + accept/reject/edit |
| `compliance/api/change_request_engine.py` | Rule-based CR generation |
| `compliance/api/generation_routes.py` | Document generation from master data |
| `compliance/api/document_templates/` | 5 template generators (DSFA, VVT, TOM, etc.) |
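The rule-based CR engine in `change_request_engine.py` maps master-data facts to proposed document changes (e.g. VVT DPIA → DSFA CR). A hedged sketch of two such rules (function and field names are assumptions, not the module's actual API):

```python
def derive_change_requests(vvt_entry: dict) -> list[dict]:
    """Rule sketch: a VVT entry that requires a DPIA yields a DSFA change
    request; each data category yields a Löschfristen (retention) CR."""
    crs = []
    if vvt_entry.get("dpia_required"):
        crs.append({
            "target_document_type": "dsfa",
            "priority": "high",
            "proposal_title": f"DSFA für {vvt_entry['name']} erstellen",
        })
    for category in vvt_entry.get("data_categories", []):
        crs.append({
            "target_document_type": "loeschfristen",
            "priority": "normal",
            "proposal_title": f"Löschfrist für Datenkategorie '{category}' prüfen",
        })
    return crs
```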
|
---

## Important files (reference)

| File | Description |
|------|--------------|
| `admin-compliance/app/(sdk)/` | All 37+ SDK routes |
| `admin-compliance/app/(sdk)/sdk/change-requests/page.tsx` | Change-request inbox |
| `admin-compliance/components/sdk/Sidebar/SDKSidebar.tsx` | SDK navigation (with CR badge) |
| `admin-compliance/components/sdk/VersionHistory.tsx` | Version-history timeline component |
| `admin-compliance/components/sdk/CommandBar.tsx` | Command palette |
| `admin-compliance/lib/sdk/context.tsx` | SDK state (provider) |
| `backend-compliance/compliance/` | Main package (50+ files) |
| `ai-compliance-sdk/` | AI compliance analysis |
| `developer-portal/` | API documentation |

admin-compliance/app/(sdk)/sdk/change-requests/page.tsx (345 lines, new file)
@@ -0,0 +1,345 @@
'use client'

import { useState, useEffect, useCallback } from 'react'

interface ChangeRequest {
  id: string
  triggerType: string
  targetDocumentType: string
  targetDocumentId: string | null
  targetSection: string | null
  proposalTitle: string
  proposalBody: string | null
  proposedChanges: Record<string, unknown>
  status: 'pending' | 'accepted' | 'rejected' | 'edited_and_accepted'
  priority: 'low' | 'normal' | 'high' | 'critical'
  decidedBy: string | null
  decidedAt: string | null
  rejectionReason: string | null
  createdBy: string
  createdAt: string
}

interface Stats {
  total_pending: number
  critical_count: number
  total_accepted: number
  total_rejected: number
  by_document_type: Record<string, number>
}

const API_BASE = '/api/sdk/v1/compliance/change-requests'

const DOC_TYPE_LABELS: Record<string, string> = {
  dsfa: 'DSFA',
  vvt: 'VVT',
  tom: 'TOM',
  loeschfristen: 'Löschfristen',
  obligation: 'Pflichten',
}

const PRIORITY_COLORS: Record<string, string> = {
  critical: 'bg-red-100 text-red-800',
  high: 'bg-orange-100 text-orange-800',
  normal: 'bg-blue-100 text-blue-800',
  low: 'bg-gray-100 text-gray-700',
}

const STATUS_COLORS: Record<string, string> = {
  pending: 'bg-yellow-100 text-yellow-800',
  accepted: 'bg-green-100 text-green-800',
  rejected: 'bg-red-100 text-red-800',
  edited_and_accepted: 'bg-emerald-100 text-emerald-800',
}

function snakeToCamel(obj: Record<string, unknown>): ChangeRequest {
  return {
    id: obj.id as string,
    triggerType: obj.trigger_type as string,
    targetDocumentType: obj.target_document_type as string,
    targetDocumentId: obj.target_document_id as string | null,
    targetSection: obj.target_section as string | null,
    proposalTitle: obj.proposal_title as string,
    proposalBody: obj.proposal_body as string | null,
    proposedChanges: (obj.proposed_changes || {}) as Record<string, unknown>,
    status: obj.status as ChangeRequest['status'],
    priority: obj.priority as ChangeRequest['priority'],
    decidedBy: obj.decided_by as string | null,
    decidedAt: obj.decided_at as string | null,
    rejectionReason: obj.rejection_reason as string | null,
    createdBy: obj.created_by as string,
    createdAt: obj.created_at as string,
  }
}

export default function ChangeRequestsPage() {
  const [requests, setRequests] = useState<ChangeRequest[]>([])
  const [stats, setStats] = useState<Stats | null>(null)
  const [filter, setFilter] = useState<string>('all')
  const [statusFilter, setStatusFilter] = useState<string>('pending')
  const [loading, setLoading] = useState(true)
  const [actionModal, setActionModal] = useState<{ type: 'accept' | 'reject' | 'edit'; cr: ChangeRequest } | null>(null)
  const [rejectReason, setRejectReason] = useState('')
  const [editBody, setEditBody] = useState('')

  const loadData = useCallback(async () => {
    try {
      const statsRes = await fetch(`${API_BASE}/stats`)
      if (statsRes.ok) setStats(await statsRes.json())

      let url = `${API_BASE}?status=${statusFilter}`
      if (filter !== 'all') url += `&target_document_type=${filter}`
      const res = await fetch(url)
      if (res.ok) {
        const data = await res.json()
        setRequests((Array.isArray(data) ? data : []).map(snakeToCamel))
      }
    } catch (e) {
      console.error('Failed to load change requests:', e)
    } finally {
      setLoading(false)
    }
  }, [filter, statusFilter])

  useEffect(() => {
    loadData()
    const interval = setInterval(loadData, 60000)
    return () => clearInterval(interval)
  }, [loadData])

  const handleAccept = async (cr: ChangeRequest) => {
    const res = await fetch(`${API_BASE}/${cr.id}/accept`, { method: 'POST' })
    if (res.ok) {
      setActionModal(null)
      loadData()
    }
  }

  const handleReject = async (cr: ChangeRequest) => {
    const res = await fetch(`${API_BASE}/${cr.id}/reject`, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ rejection_reason: rejectReason }),
    })
    if (res.ok) {
      setActionModal(null)
      setRejectReason('')
      loadData()
    }
  }

  const handleEdit = async (cr: ChangeRequest) => {
    const res = await fetch(`${API_BASE}/${cr.id}/edit`, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ proposal_body: editBody }),
    })
    if (res.ok) {
      setActionModal(null)
      setEditBody('')
      loadData()
    }
  }

  const handleDelete = async (id: string) => {
    if (!confirm('Änderungsanfrage wirklich löschen?')) return
    const res = await fetch(`${API_BASE}/${id}`, { method: 'DELETE' })
    if (res.ok) loadData()
  }

  return (
    <div className="p-6 max-w-7xl mx-auto">
      <h1 className="text-2xl font-bold text-gray-900 mb-6">Änderungsanfragen</h1>

      {/* Stats Bar */}
      {stats && (
        <div className="grid grid-cols-2 md:grid-cols-4 gap-4 mb-6">
          <div className="bg-yellow-50 border border-yellow-200 rounded-lg p-4">
            <div className="text-2xl font-bold text-yellow-700">{stats.total_pending}</div>
            <div className="text-sm text-yellow-600">Offen</div>
          </div>
          <div className="bg-red-50 border border-red-200 rounded-lg p-4">
            <div className="text-2xl font-bold text-red-700">{stats.critical_count}</div>
            <div className="text-sm text-red-600">Kritisch</div>
          </div>
          <div className="bg-green-50 border border-green-200 rounded-lg p-4">
            <div className="text-2xl font-bold text-green-700">{stats.total_accepted}</div>
            <div className="text-sm text-green-600">Angenommen</div>
          </div>
          <div className="bg-gray-50 border border-gray-200 rounded-lg p-4">
            <div className="text-2xl font-bold text-gray-700">{stats.total_rejected}</div>
            <div className="text-sm text-gray-600">Abgelehnt</div>
          </div>
        </div>
      )}

      {/* Filters */}
      <div className="flex flex-wrap gap-2 mb-6">
        <div className="flex gap-1 bg-gray-100 rounded-lg p-1">
          {['pending', 'accepted', 'rejected'].map(s => (
            <button
              key={s}
              onClick={() => setStatusFilter(s)}
              className={`px-3 py-1 rounded-md text-sm font-medium transition-colors ${
                statusFilter === s ? 'bg-white shadow text-purple-700' : 'text-gray-600 hover:text-gray-900'
              }`}
            >
              {s === 'pending' ? 'Offen' : s === 'accepted' ? 'Angenommen' : 'Abgelehnt'}
            </button>
          ))}
        </div>
        <div className="flex gap-1 bg-gray-100 rounded-lg p-1">
          {['all', 'dsfa', 'vvt', 'tom', 'loeschfristen', 'obligation'].map(t => (
            <button
              key={t}
              onClick={() => setFilter(t)}
              className={`px-3 py-1 rounded-md text-sm font-medium transition-colors ${
                filter === t ? 'bg-white shadow text-purple-700' : 'text-gray-600 hover:text-gray-900'
              }`}
            >
              {t === 'all' ? 'Alle' : DOC_TYPE_LABELS[t] || t}
            </button>
          ))}
        </div>
      </div>

      {/* Request List */}
      {loading ? (
        <div className="text-center py-12 text-gray-500">Laden...</div>
      ) : requests.length === 0 ? (
        <div className="text-center py-12 text-gray-400">
          Keine Änderungsanfragen {statusFilter === 'pending' ? 'offen' : 'gefunden'}
        </div>
      ) : (
        <div className="space-y-3">
          {requests.map(cr => (
            <div key={cr.id} className="bg-white border border-gray-200 rounded-lg p-4 hover:shadow-sm transition-shadow">
              <div className="flex items-start justify-between">
                <div className="flex-1">
                  <div className="flex items-center gap-2 mb-1">
                    <span className={`px-2 py-0.5 rounded text-xs font-medium ${PRIORITY_COLORS[cr.priority]}`}>
                      {cr.priority}
                    </span>
                    <span className={`px-2 py-0.5 rounded text-xs font-medium ${STATUS_COLORS[cr.status]}`}>
                      {cr.status}
                    </span>
                    <span className="px-2 py-0.5 rounded text-xs font-medium bg-purple-100 text-purple-700">
                      {DOC_TYPE_LABELS[cr.targetDocumentType] || cr.targetDocumentType}
                    </span>
                    {cr.triggerType !== 'manual' && (
                      <span className="px-2 py-0.5 rounded text-xs bg-gray-100 text-gray-600">
                        {cr.triggerType}
                      </span>
                    )}
                  </div>
                  <h3 className="font-medium text-gray-900">{cr.proposalTitle}</h3>
                  {cr.proposalBody && (
                    <p className="text-sm text-gray-600 mt-1 line-clamp-2">{cr.proposalBody}</p>
                  )}
                  <div className="text-xs text-gray-400 mt-2">
                    {new Date(cr.createdAt).toLocaleDateString('de-DE')} — {cr.createdBy}
                    {cr.rejectionReason && (
                      <span className="text-red-500 ml-2">Grund: {cr.rejectionReason}</span>
                    )}
                  </div>
                </div>

                {cr.status === 'pending' && (
                  <div className="flex gap-2 ml-4">
                    <button
                      onClick={() => { setActionModal({ type: 'accept', cr }) }}
                      className="px-3 py-1 bg-green-600 text-white rounded text-sm hover:bg-green-700"
                    >
                      Annehmen
                    </button>
                    <button
                      onClick={() => { setEditBody(cr.proposalBody || ''); setActionModal({ type: 'edit', cr }) }}
                      className="px-3 py-1 bg-blue-600 text-white rounded text-sm hover:bg-blue-700"
                    >
                      Bearbeiten
                    </button>
                    <button
                      onClick={() => { setRejectReason(''); setActionModal({ type: 'reject', cr }) }}
                      className="px-3 py-1 bg-gray-200 text-gray-700 rounded text-sm hover:bg-gray-300"
                    >
                      Ablehnen
                    </button>
                    <button
                      onClick={() => handleDelete(cr.id)}
                      className="px-3 py-1 text-red-600 rounded text-sm hover:bg-red-50"
                    >
                      Löschen
                    </button>
                  </div>
                )}
              </div>
            </div>
          ))}
        </div>
      )}

      {/* Action Modal */}
      {actionModal && (
        <div className="fixed inset-0 bg-black/40 flex items-center justify-center z-50" onClick={() => setActionModal(null)}>
          <div className="bg-white rounded-xl p-6 w-full max-w-lg" onClick={e => e.stopPropagation()}>
            {actionModal.type === 'accept' && (
              <>
                <h3 className="text-lg font-semibold mb-4">Änderung annehmen?</h3>
                <p className="text-sm text-gray-600 mb-4">{actionModal.cr.proposalTitle}</p>
                {Object.keys(actionModal.cr.proposedChanges).length > 0 && (
                  <pre className="bg-gray-50 p-3 rounded text-xs mb-4 max-h-48 overflow-auto">
                    {JSON.stringify(actionModal.cr.proposedChanges, null, 2)}
                  </pre>
                )}
                <div className="flex justify-end gap-3">
                  <button onClick={() => setActionModal(null)} className="px-4 py-2 text-gray-600">Abbrechen</button>
                  <button onClick={() => handleAccept(actionModal.cr)} className="px-4 py-2 bg-green-600 text-white rounded-lg">Annehmen</button>
                </div>
              </>
            )}
            {actionModal.type === 'reject' && (
              <>
                <h3 className="text-lg font-semibold mb-4">Änderung ablehnen</h3>
                <textarea
                  value={rejectReason}
                  onChange={e => setRejectReason(e.target.value)}
                  placeholder="Begründung..."
                  className="w-full border rounded-lg p-3 h-24 mb-4"
                />
                <div className="flex justify-end gap-3">
                  <button onClick={() => setActionModal(null)} className="px-4 py-2 text-gray-600">Abbrechen</button>
                  <button
                    onClick={() => handleReject(actionModal.cr)}
                    disabled={!rejectReason.trim()}
                    className="px-4 py-2 bg-red-600 text-white rounded-lg disabled:opacity-50"
                  >
                    Ablehnen
                  </button>
                </div>
              </>
            )}
            {actionModal.type === 'edit' && (
              <>
                <h3 className="text-lg font-semibold mb-4">Vorschlag bearbeiten & annehmen</h3>
                <textarea
                  value={editBody}
                  onChange={e => setEditBody(e.target.value)}
                  className="w-full border rounded-lg p-3 h-32 mb-4"
                />
                <div className="flex justify-end gap-3">
                  <button onClick={() => setActionModal(null)} className="px-4 py-2 text-gray-600">Abbrechen</button>
                  <button
                    onClick={() => handleEdit(actionModal.cr)}
                    className="px-4 py-2 bg-blue-600 text-white rounded-lg"
                  >
                    Speichern & Annehmen
                  </button>
                </div>
              </>
            )}
          </div>
        </div>
      )}
    </div>
  )
}
||||
@@ -35,9 +35,11 @@ const BASE_WIZARD_STEPS = [
  { id: 3, name: 'Firmengroesse', description: 'Mitarbeiter und Umsatz' },
  { id: 4, name: 'Standorte', description: 'Hauptsitz und Zielmaerkte' },
  { id: 5, name: 'Datenschutz', description: 'Rollen und KI-Nutzung' },
  { id: 6, name: 'Systeme & KI', description: 'IT-Systeme und KI-Katalog' },
  { id: 7, name: 'Rechtlicher Rahmen', description: 'Regulierungen und Prüfzyklen' },
]

const MACHINE_BUILDER_STEP = { id: 8, name: 'Produkt & Maschine', description: 'Software, KI und CE in Ihrem Produkt' }

function getWizardSteps(industry: string) {
  if (isMachineBuilderIndustry(industry)) {
@@ -542,7 +544,277 @@ function StepDataProtection({
}

// =============================================================================
// STEP 6: SYSTEME & KI
// =============================================================================

interface ProcessingSystem {
  name: string
  vendor: string
  hosting: string
  personal_data_categories: string[]
}

interface AISystem {
  name: string
  purpose: string
  risk_category: string
  vendor: string
  has_human_oversight: boolean
}

function StepSystemsAndAI({
  data,
  onChange,
}: {
  data: Partial<CompanyProfile> & { processingSystems?: ProcessingSystem[]; aiSystems?: AISystem[] }
  onChange: (updates: Record<string, unknown>) => void
}) {
  const systems = (data as any).processingSystems || []
  const aiSystems = (data as any).aiSystems || []

  const addSystem = () => {
    onChange({ processingSystems: [...systems, { name: '', vendor: '', hosting: 'cloud', personal_data_categories: [] }] })
  }
  const removeSystem = (i: number) => {
    onChange({ processingSystems: systems.filter((_: ProcessingSystem, idx: number) => idx !== i) })
  }
  const updateSystem = (i: number, updates: Partial<ProcessingSystem>) => {
    const updated = [...systems]
    updated[i] = { ...updated[i], ...updates }
    onChange({ processingSystems: updated })
  }

  const addAISystem = () => {
    onChange({ aiSystems: [...aiSystems, { name: '', purpose: '', risk_category: 'limited', vendor: '', has_human_oversight: true }] })
  }
  const removeAISystem = (i: number) => {
    onChange({ aiSystems: aiSystems.filter((_: AISystem, idx: number) => idx !== i) })
  }
  const updateAISystem = (i: number, updates: Partial<AISystem>) => {
    const updated = [...aiSystems]
    updated[i] = { ...updated[i], ...updates }
    onChange({ aiSystems: updated })
  }

  return (
    <div className="space-y-8">
      {/* Processing Systems */}
      <div>
        <div className="flex items-center justify-between mb-4">
          <div>
            <h3 className="text-sm font-medium text-gray-700">IT-Systeme mit personenbezogenen Daten</h3>
            <p className="text-xs text-gray-500">Systeme, die personenbezogene Daten verarbeiten (fuer VVT-Generierung)</p>
          </div>
          <button type="button" onClick={addSystem} className="px-3 py-1.5 text-sm bg-purple-100 text-purple-700 rounded-lg hover:bg-purple-200">
            + System
          </button>
        </div>
        {systems.length === 0 && (
          <div className="text-center py-6 text-gray-400 border-2 border-dashed rounded-lg">Noch keine Systeme hinzugefuegt</div>
        )}
        <div className="space-y-3">
          {systems.map((sys: ProcessingSystem, i: number) => (
            <div key={i} className="border border-gray-200 rounded-lg p-4 space-y-3">
              <div className="flex justify-between items-center">
                <span className="text-xs font-medium text-gray-400">System {i + 1}</span>
                <button type="button" onClick={() => removeSystem(i)} className="text-red-400 hover:text-red-600 text-xs">Entfernen</button>
              </div>
              <div className="grid grid-cols-2 gap-3">
                <input type="text" value={sys.name} onChange={e => updateSystem(i, { name: e.target.value })} placeholder="Name (z.B. SAP HR)" className="px-3 py-2 border border-gray-300 rounded-lg text-sm focus:ring-2 focus:ring-purple-500 focus:border-transparent" />
                <input type="text" value={sys.vendor} onChange={e => updateSystem(i, { vendor: e.target.value })} placeholder="Hersteller" className="px-3 py-2 border border-gray-300 rounded-lg text-sm focus:ring-2 focus:ring-purple-500 focus:border-transparent" />
              </div>
              <div className="grid grid-cols-2 gap-3">
                <select value={sys.hosting} onChange={e => updateSystem(i, { hosting: e.target.value })} className="px-3 py-2 border border-gray-300 rounded-lg text-sm focus:ring-2 focus:ring-purple-500 focus:border-transparent">
                  <option value="on-premise">On-Premise</option>
                  <option value="cloud">Cloud (EU)</option>
                  <option value="us-cloud">Cloud (US)</option>
                  <option value="hybrid">Hybrid</option>
                </select>
                <input type="text" value={sys.personal_data_categories.join(', ')} onChange={e => updateSystem(i, { personal_data_categories: e.target.value.split(',').map(s => s.trim()).filter(Boolean) })} placeholder="Datenkategorien (kommagetrennt)" className="px-3 py-2 border border-gray-300 rounded-lg text-sm focus:ring-2 focus:ring-purple-500 focus:border-transparent" />
              </div>
            </div>
          ))}
        </div>
      </div>

      {/* AI Systems */}
      <div className="border-t border-gray-200 pt-6">
        <div className="flex items-center justify-between mb-4">
          <div>
            <h3 className="text-sm font-medium text-gray-700">KI-Systeme</h3>
            <p className="text-xs text-gray-500">Strukturierter KI-Katalog fuer AI Act Compliance</p>
          </div>
          <button type="button" onClick={addAISystem} className="px-3 py-1.5 text-sm bg-purple-100 text-purple-700 rounded-lg hover:bg-purple-200">
            + KI-System
          </button>
        </div>
        {aiSystems.length === 0 && (
          <div className="text-center py-6 text-gray-400 border-2 border-dashed rounded-lg">Noch keine KI-Systeme</div>
        )}
        <div className="space-y-3">
          {aiSystems.map((ai: AISystem, i: number) => (
            <div key={i} className="border border-gray-200 rounded-lg p-4 space-y-3">
              <div className="flex justify-between items-center">
                <span className="text-xs font-medium text-gray-400">KI-System {i + 1}</span>
                <button type="button" onClick={() => removeAISystem(i)} className="text-red-400 hover:text-red-600 text-xs">Entfernen</button>
              </div>
              <div className="grid grid-cols-2 gap-3">
                <input type="text" value={ai.name} onChange={e => updateAISystem(i, { name: e.target.value })} placeholder="Name (z.B. Chatbot)" className="px-3 py-2 border border-gray-300 rounded-lg text-sm focus:ring-2 focus:ring-purple-500 focus:border-transparent" />
                <input type="text" value={ai.vendor} onChange={e => updateAISystem(i, { vendor: e.target.value })} placeholder="Anbieter" className="px-3 py-2 border border-gray-300 rounded-lg text-sm focus:ring-2 focus:ring-purple-500 focus:border-transparent" />
              </div>
              <input type="text" value={ai.purpose} onChange={e => updateAISystem(i, { purpose: e.target.value })} placeholder="Zweck (z.B. Kundensupport)" className="w-full px-3 py-2 border border-gray-300 rounded-lg text-sm focus:ring-2 focus:ring-purple-500 focus:border-transparent" />
              <div className="grid grid-cols-2 gap-3">
                <select value={ai.risk_category} onChange={e => updateAISystem(i, { risk_category: e.target.value })} className="px-3 py-2 border border-gray-300 rounded-lg text-sm focus:ring-2 focus:ring-purple-500 focus:border-transparent">
                  <option value="minimal">Minimal Risk</option>
                  <option value="limited">Limited Risk</option>
                  <option value="high">High Risk</option>
                  <option value="unacceptable">Unacceptable Risk</option>
                </select>
                <label className="flex items-center gap-2 px-3 py-2">
                  <input type="checkbox" checked={ai.has_human_oversight} onChange={e => updateAISystem(i, { has_human_oversight: e.target.checked })} className="w-4 h-4 text-purple-600 rounded focus:ring-purple-500" />
                  <span className="text-sm text-gray-700">Human Oversight</span>
                </label>
              </div>
            </div>
          ))}
        </div>
      </div>
    </div>
  )
}

// =============================================================================
// STEP 7: RECHTLICHER RAHMEN
// =============================================================================

function StepLegalFramework({
  data,
  onChange,
}: {
  data: Partial<CompanyProfile> & { subjectToNis2?: boolean; subjectToAiAct?: boolean; subjectToIso27001?: boolean; supervisoryAuthority?: string; reviewCycleMonths?: number; technicalContacts?: { name: string; role: string; email: string }[] }
  onChange: (updates: Record<string, unknown>) => void
}) {
  const contacts = (data as any).technicalContacts || []

  const addContact = () => {
    onChange({ technicalContacts: [...contacts, { name: '', role: '', email: '' }] })
  }
  const removeContact = (i: number) => {
    onChange({ technicalContacts: contacts.filter((_: { name: string; role: string; email: string }, idx: number) => idx !== i) })
  }
  const updateContact = (i: number, updates: Partial<{ name: string; role: string; email: string }>) => {
    const updated = [...contacts]
    updated[i] = { ...updated[i], ...updates }
    onChange({ technicalContacts: updated })
  }

  return (
    <div className="space-y-8">
      {/* Regulatory Flags */}
      <div>
        <h3 className="text-sm font-medium text-gray-700 mb-4">Regulatorischer Rahmen</h3>
        <div className="space-y-3">
          {[
            { key: 'subjectToNis2', label: 'NIS2-Richtlinie', desc: 'Ihr Unternehmen fällt unter die NIS2-Richtlinie (Netzwerk- und Informationssicherheit)' },
            { key: 'subjectToAiAct', label: 'EU AI Act', desc: 'Ihr Unternehmen setzt KI-Systeme ein, die unter den AI Act fallen' },
            { key: 'subjectToIso27001', label: 'ISO 27001', desc: 'Ihr Unternehmen strebt ISO 27001 Zertifizierung an oder ist bereits zertifiziert' },
          ].map(item => (
            <label
              key={item.key}
              className={`flex items-start gap-4 p-4 rounded-xl border-2 cursor-pointer transition-all ${
                (data as any)[item.key] ? 'border-purple-500 bg-purple-50' : 'border-gray-200 hover:border-purple-300'
              }`}
            >
              <input
                type="checkbox"
                checked={(data as any)[item.key] ?? false}
                onChange={e => onChange({ [item.key]: e.target.checked })}
                className="mt-1 w-5 h-5 text-purple-600 rounded focus:ring-purple-500"
              />
              <div>
                <div className="font-medium text-gray-900">{item.label}</div>
                <div className="text-sm text-gray-500">{item.desc}</div>
              </div>
            </label>
          ))}
        </div>
      </div>

      {/* Supervisory Authority & Review Cycle */}
      <div className="grid grid-cols-2 gap-6">
        <div>
          <label className="block text-sm font-medium text-gray-700 mb-2">Aufsichtsbehörde</label>
          <select
            value={(data as any).supervisoryAuthority || ''}
            onChange={e => onChange({ supervisoryAuthority: e.target.value })}
            className="w-full px-4 py-3 border border-gray-300 rounded-lg focus:ring-2 focus:ring-purple-500 focus:border-transparent"
          >
            <option value="">Bitte wählen...</option>
            <option value="LfDI BW">LfDI Baden-Württemberg</option>
            <option value="BayLDA">BayLDA Bayern</option>
            <option value="BlnBDI">BlnBDI Berlin</option>
            <option value="LDA BB">LDA Brandenburg</option>
            <option value="LfDI HB">LfDI Bremen</option>
            <option value="HmbBfDI">HmbBfDI Hamburg</option>
            <option value="HBDI">HBDI Hessen</option>
            <option value="LfDI MV">LfDI Mecklenburg-Vorpommern</option>
            <option value="LfD NI">LfD Niedersachsen</option>
            <option value="LDI NRW">LDI NRW</option>
            <option value="LfDI RP">LfDI Rheinland-Pfalz</option>
            <option value="UDZ SL">UDZ Saarland</option>
            <option value="SächsDSB">Sächsischer DSB</option>
            <option value="LfD LSA">LfD Sachsen-Anhalt</option>
            <option value="ULD SH">ULD Schleswig-Holstein</option>
            <option value="TLfDI">TLfDI Thüringen</option>
            <option value="BfDI">BfDI (Bund)</option>
          </select>
        </div>
        <div>
          <label className="block text-sm font-medium text-gray-700 mb-2">Prüfzyklus (Monate)</label>
          <select
            value={(data as any).reviewCycleMonths || 12}
            onChange={e => onChange({ reviewCycleMonths: parseInt(e.target.value) })}
            className="w-full px-4 py-3 border border-gray-300 rounded-lg focus:ring-2 focus:ring-purple-500 focus:border-transparent"
          >
            <option value={3}>Vierteljährlich (3 Monate)</option>
            <option value={6}>Halbjährlich (6 Monate)</option>
            <option value={12}>Jährlich (12 Monate)</option>
            <option value={24}>Zweijährlich (24 Monate)</option>
          </select>
        </div>
      </div>

      {/* Technical Contacts */}
      <div className="border-t border-gray-200 pt-6">
        <div className="flex items-center justify-between mb-4">
          <div>
            <h3 className="text-sm font-medium text-gray-700">Technische Ansprechpartner</h3>
            <p className="text-xs text-gray-500">CISO, IT-Manager, DSB etc.</p>
          </div>
          <button type="button" onClick={addContact} className="px-3 py-1.5 text-sm bg-purple-100 text-purple-700 rounded-lg hover:bg-purple-200">
            + Kontakt
          </button>
        </div>
        {contacts.length === 0 && (
          <div className="text-center py-4 text-gray-400 border-2 border-dashed rounded-lg text-sm">Noch keine Kontakte</div>
        )}
        <div className="space-y-3">
          {contacts.map((c: { name: string; role: string; email: string }, i: number) => (
            <div key={i} className="flex gap-3 items-center">
              <input type="text" value={c.name} onChange={e => updateContact(i, { name: e.target.value })} placeholder="Name" className="flex-1 px-3 py-2 border border-gray-300 rounded-lg text-sm focus:ring-2 focus:ring-purple-500 focus:border-transparent" />
              <input type="text" value={c.role} onChange={e => updateContact(i, { role: e.target.value })} placeholder="Rolle (z.B. CISO)" className="flex-1 px-3 py-2 border border-gray-300 rounded-lg text-sm focus:ring-2 focus:ring-purple-500 focus:border-transparent" />
              <input type="email" value={c.email} onChange={e => updateContact(i, { email: e.target.value })} placeholder="E-Mail" className="flex-1 px-3 py-2 border border-gray-300 rounded-lg text-sm focus:ring-2 focus:ring-purple-500 focus:border-transparent" />
              <button type="button" onClick={() => removeContact(i)} className="text-red-400 hover:text-red-600 text-sm">X</button>
|
||||
</div>
|
||||
))}
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
)
|
||||
}
|
||||
|
||||
// =============================================================================
|
||||
// STEP 8: PRODUKT & MASCHINE (nur fuer Maschinenbauer)
|
||||
// =============================================================================
|
||||
|
||||
const EMPTY_MACHINE_BUILDER: MachineBuilderProfile = {
|
||||
@@ -1090,6 +1362,71 @@ function CoverageAssessmentPanel({ profile }: { profile: Partial<CompanyProfile>
  )
}

// =============================================================================
// GENERATE DOCUMENTS BUTTON
// =============================================================================

const DOC_TYPES = [
  { id: 'dsfa', label: 'DSFA', desc: 'Datenschutz-Folgenabschätzung' },
  { id: 'vvt', label: 'VVT', desc: 'Verarbeitungsverzeichnis' },
  { id: 'tom', label: 'TOM', desc: 'Technisch-Organisatorische Maßnahmen' },
  { id: 'loeschfristen', label: 'Löschfristen', desc: 'Löschfristen-Katalog' },
  { id: 'obligation', label: 'Pflichten', desc: 'Compliance-Pflichten' },
]

function GenerateDocumentsButton() {
  const [generating, setGenerating] = useState<string | null>(null)
  const [results, setResults] = useState<Record<string, { ok: boolean; count: number }>>({})

  const handleGenerate = async (docType: string) => {
    setGenerating(docType)
    try {
      const res = await fetch(`/api/sdk/v1/compliance/generation/apply/${docType}`, { method: 'POST' })
      if (res.ok) {
        const data = await res.json()
        setResults(prev => ({ ...prev, [docType]: { ok: true, count: data.change_requests_created || 0 } }))
      } else {
        setResults(prev => ({ ...prev, [docType]: { ok: false, count: 0 } }))
      }
    } catch {
      setResults(prev => ({ ...prev, [docType]: { ok: false, count: 0 } }))
    } finally {
      setGenerating(null)
    }
  }

  return (
    <div className="space-y-2">
      {DOC_TYPES.map(dt => (
        <div key={dt.id} className="flex items-center justify-between">
          <div>
            <span className="text-sm font-medium text-gray-900">{dt.label}</span>
            <span className="text-xs text-gray-500 ml-1">({dt.desc})</span>
          </div>
          {results[dt.id] ? (
            <span className={`text-xs px-2 py-1 rounded ${results[dt.id].ok ? 'bg-green-100 text-green-700' : 'bg-red-100 text-red-700'}`}>
              {results[dt.id].ok ? `${results[dt.id].count} CR erstellt` : 'Fehler'}
            </span>
          ) : (
            <button
              onClick={() => handleGenerate(dt.id)}
              disabled={generating !== null}
              className="px-3 py-1 text-xs bg-purple-600 text-white rounded hover:bg-purple-700 disabled:opacity-50"
            >
              {generating === dt.id ? 'Generiere...' : 'Generieren'}
            </button>
          )}
        </div>
      ))}
      {Object.keys(results).length > 0 && (
        <a href="/sdk/change-requests" className="block text-center text-sm text-purple-600 hover:text-purple-800 font-medium mt-3">
          Zur Änderungsanfragen-Inbox →
        </a>
      )}
    </div>
  )
}
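The same apply endpoint the button above calls can also be driven from a script. A minimal sketch: the endpoint path and the `change_requests_created` response field come from the diff above, while the helper name `apply_generation`, the injectable `transport` callable, and the stub are hypothetical additions for testability.

```python
import json
from typing import Callable, Tuple

# Hypothetical helper mirroring handleGenerate() above.
# `transport` is injected so the logic can be exercised without a server:
# it takes a URL and returns (status_code, body_bytes).
def apply_generation(
    doc_type: str,
    transport: Callable[[str], Tuple[int, bytes]],
    base: str = "/api/sdk/v1/compliance",
) -> dict:
    status, body = transport(f"{base}/generation/apply/{doc_type}")
    if status != 200:
        return {"ok": False, "count": 0}
    data = json.loads(body)
    return {"ok": True, "count": data.get("change_requests_created", 0)}

# Stub transport simulating a successful apply that created 3 change requests
def fake_transport(url: str) -> Tuple[int, bytes]:
    return 200, json.dumps({"change_requests_created": 3}).encode()

print(apply_generation("dsfa", fake_transport))  # → {'ok': True, 'count': 3}
```

Like the button, the helper treats any non-200 response as a failure with a zero count rather than raising.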

// =============================================================================
// MAIN COMPONENT
// =============================================================================

@@ -1165,10 +1502,21 @@ export default function CompanyProfilePage() {
          dpoName: data.dpo_name || '',
          dpoEmail: data.dpo_email || '',
          isComplete: data.is_complete || false,
-       }
+         // Phase 2 extended fields
+         processingSystems: data.processing_systems || [],
+         aiSystems: data.ai_systems || [],
+         technicalContacts: data.technical_contacts || [],
+         subjectToNis2: data.subject_to_nis2 || false,
+         subjectToAiAct: data.subject_to_ai_act || false,
+         subjectToIso27001: data.subject_to_iso27001 || false,
+         supervisoryAuthority: data.supervisory_authority || '',
+         reviewCycleMonths: data.review_cycle_months || 12,
+         repos: data.repos || [],
+         documentSources: data.document_sources || [],
+       } as any
        setFormData(backendProfile)
        if (backendProfile.isComplete) {
-         setCurrentStep(5)
+         setCurrentStep(7)
        }
        return
      }

@@ -1181,7 +1529,7 @@ export default function CompanyProfilePage() {
      if (!cancelled && state.companyProfile) {
        setFormData(state.companyProfile)
        if (state.companyProfile.isComplete) {
-         setCurrentStep(5)
+         setCurrentStep(7)
        }
      }
    }

@@ -1198,10 +1546,10 @@ export default function CompanyProfilePage() {

  const handleNext = () => {
    if (currentStep < lastStep) {
-     // Skip step 6 if not a machine builder
+     // Skip step 8 if not a machine builder
      const nextStep = currentStep + 1
-     if (nextStep === 6 && !showMachineBuilderStep) {
-       // Complete profile (was step 5, last step for non-machine-builders)
+     if (nextStep === 8 && !showMachineBuilderStep) {
+       // Complete profile (was step 7, last step for non-machine-builders)
        completeAndSaveProfile()
        return
      }

@@ -1250,6 +1598,17 @@ export default function CompanyProfilePage() {
        dpo_name: formData.dpoName || '',
        dpo_email: formData.dpoEmail || '',
        is_complete: true,
+       // Phase 2 extended fields
+       processing_systems: (formData as any).processingSystems || [],
+       ai_systems: (formData as any).aiSystems || [],
+       technical_contacts: (formData as any).technicalContacts || [],
+       subject_to_nis2: (formData as any).subjectToNis2 || false,
+       subject_to_ai_act: (formData as any).subjectToAiAct || false,
+       subject_to_iso27001: (formData as any).subjectToIso27001 || false,
+       supervisory_authority: (formData as any).supervisoryAuthority || '',
+       review_cycle_months: (formData as any).reviewCycleMonths || 12,
+       repos: (formData as any).repos || [],
+       document_sources: (formData as any).documentSources || [],
        // Machine builder fields (if applicable)
        ...(formData.machineBuilder ? {
          machine_builder: {

@@ -1351,6 +1710,10 @@ export default function CompanyProfilePage() {
      case 5:
        return true
      case 6:
+       return true // Systems & AI step is optional
+     case 7:
+       return true // Legal framework step is optional
+     case 8:
        // Machine builder step: require at least product description
        return (formData.machineBuilder?.productDescription?.length || 0) > 0
      default:

@@ -1358,7 +1721,7 @@ export default function CompanyProfilePage() {
    }
  }

- const isLastStep = currentStep === lastStep || (currentStep === 5 && !showMachineBuilderStep)
+ const isLastStep = currentStep === lastStep || (currentStep === 7 && !showMachineBuilderStep)

  return (
    <div className="min-h-screen bg-gray-50 py-8">

@@ -1438,7 +1801,9 @@ export default function CompanyProfilePage() {
            {currentStep === 3 && <StepCompanySize data={formData} onChange={updateFormData} />}
            {currentStep === 4 && <StepLocations data={formData} onChange={updateFormData} />}
            {currentStep === 5 && <StepDataProtection data={formData} onChange={updateFormData} />}
-           {currentStep === 6 && showMachineBuilderStep && <StepMachineBuilder data={formData} onChange={updateFormData} />}
+           {currentStep === 6 && <StepSystemsAndAI data={formData} onChange={updateFormData} />}
+           {currentStep === 7 && <StepLegalFramework data={formData} onChange={updateFormData} />}
+           {currentStep === 8 && showMachineBuilderStep && <StepMachineBuilder data={formData} onChange={updateFormData} />}

            {/* Navigation */}
            <div className="flex justify-between mt-8 pt-6 border-t border-gray-200">

@@ -1517,6 +1882,17 @@ export default function CompanyProfilePage() {
              </div>
            </div>
          </div>
+         {/* Generate Documents CTA */}
+         {formData.isComplete && (
+           <div className="mt-6 bg-gradient-to-br from-purple-50 to-indigo-50 rounded-xl border border-purple-200 p-6">
+             <h3 className="font-semibold text-purple-900 mb-2">Dokumente generieren</h3>
+             <p className="text-sm text-purple-700 mb-4">
+               Basierend auf Ihrem Profil können DSFA, VVT, TOM, Löschfristen und Pflichten automatisch als Entwürfe generiert werden.
+             </p>
+             <GenerateDocumentsButton />
+           </div>
+         )}

          {/* Delete Profile Button */}
          {formData.companyName && (
            <div className="mt-6">

@@ -342,6 +342,25 @@ function CorpusStalenessInfo({ ragCorpusStatus }: { ragCorpusStatus: RAGCorpusSt

export function SDKSidebar({ collapsed = false, onCollapsedChange }: SDKSidebarProps) {
  const pathname = usePathname()
  const { state, packageCompletion, completionPercentage, getCheckpointStatus } = useSDK()
  const [pendingCRCount, setPendingCRCount] = React.useState(0)

  // Poll pending change-request count every 60s
  React.useEffect(() => {
    let active = true
    async function fetchCRCount() {
      try {
        const res = await fetch('/api/sdk/v1/compliance/change-requests/stats')
        if (res.ok) {
          const data = await res.json()
          if (active) setPendingCRCount(data.total_pending || 0)
        }
      } catch { /* ignore */ }
    }
    fetchCRCount()
    const interval = setInterval(fetchCRCount, 60000)
    return () => { active = false; clearInterval(interval) }
  }, [])

  const [expandedPackages, setExpandedPackages] = React.useState<Record<SDKPackageId, boolean>>({
    'vorbereitung': true,
    'analyse': false,

@@ -695,6 +714,35 @@ export function SDKSidebar({ collapsed = false, onCollapsedChange }: SDKSidebarP
        isActive={pathname === '/sdk/catalog-manager'}
        collapsed={collapsed}
      />
      <Link
        href="/sdk/change-requests"
        className={`flex items-center gap-3 px-4 py-2.5 text-sm transition-colors ${
          collapsed ? 'justify-center' : ''
        } ${
          pathname === '/sdk/change-requests'
            ? 'bg-purple-100 text-purple-900 font-medium'
            : 'text-gray-600 hover:bg-gray-50 hover:text-gray-900'
        }`}
        title={collapsed ? `Änderungsanfragen (${pendingCRCount})` : undefined}
      >
        <svg className="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
          <path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2}
            d="M9 5H7a2 2 0 00-2 2v12a2 2 0 002 2h10a2 2 0 002-2V7a2 2 0 00-2-2h-2M9 5a2 2 0 002 2h2a2 2 0 002-2M9 5a2 2 0 012-2h2a2 2 0 012 2m-3 7h3m-3 4h3m-6-4h.01M9 16h.01" />
        </svg>
        {!collapsed && (
          <span className="flex items-center gap-2">
            Änderungsanfragen
            {pendingCRCount > 0 && (
              <span className="px-1.5 py-0.5 text-xs font-bold bg-red-500 text-white rounded-full min-w-[1.25rem] text-center">
                {pendingCRCount}
              </span>
            )}
          </span>
        )}
        {collapsed && pendingCRCount > 0 && (
          <span className="absolute top-1 right-1 w-2 h-2 bg-red-500 rounded-full" />
        )}
      </Link>
      <AdditionalModuleItem
        href="https://macmini:3006"
        icon={

110
admin-compliance/components/sdk/VersionHistory.tsx
Normal file
@@ -0,0 +1,110 @@
'use client'

import { useState, useEffect } from 'react'

interface Version {
  id: string
  versionNumber: number
  status: string
  changeSummary: string | null
  changedSections: string[]
  createdBy: string
  approvedBy: string | null
  approvedAt: string | null
  createdAt: string | null
}

interface VersionHistoryProps {
  documentType: 'dsfa' | 'vvt' | 'tom' | 'loeschfristen' | 'obligation'
  documentId: string
  apiPath: string
}

export default function VersionHistory({ documentType, documentId, apiPath }: VersionHistoryProps) {
  const [versions, setVersions] = useState<Version[]>([])
  const [loading, setLoading] = useState(true)
  const [expanded, setExpanded] = useState(false)

  useEffect(() => {
    async function load() {
      try {
        const res = await fetch(`/api/sdk/v1/compliance/${apiPath}`)
        if (res.ok) {
          const data = await res.json()
          setVersions(
            (Array.isArray(data) ? data : []).map((v: Record<string, unknown>) => ({
              id: v.id as string,
              versionNumber: v.version_number as number,
              status: v.status as string,
              changeSummary: v.change_summary as string | null,
              changedSections: (v.changed_sections || []) as string[],
              createdBy: v.created_by as string,
              approvedBy: v.approved_by as string | null,
              approvedAt: v.approved_at as string | null,
              createdAt: v.created_at as string | null,
            }))
          )
        }
      } catch (e) {
        console.error('Failed to load versions:', e)
      } finally {
        setLoading(false)
      }
    }
    if (documentId) load()
  }, [documentId, apiPath])

  if (loading) return null
  if (versions.length === 0) return null

  return (
    <div className="mt-4">
      <button
        onClick={() => setExpanded(!expanded)}
        className="flex items-center gap-2 text-sm text-purple-600 hover:text-purple-800 font-medium"
      >
        <svg className={`w-4 h-4 transition-transform ${expanded ? 'rotate-90' : ''}`} fill="none" viewBox="0 0 24 24" stroke="currentColor">
          <path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M9 5l7 7-7 7" />
        </svg>
        Versionen ({versions.length})
      </button>

      {expanded && (
        <div className="mt-3 border-l-2 border-purple-200 pl-4 space-y-3">
          {versions.map(v => (
            <div key={v.id} className="relative">
              <div className="absolute -left-[1.35rem] top-1.5 w-2.5 h-2.5 rounded-full bg-purple-400 border-2 border-white" />
              <div className="text-sm">
                <div className="flex items-center gap-2">
                  <span className="font-medium text-gray-900">v{v.versionNumber}</span>
                  <span className={`px-1.5 py-0.5 rounded text-xs ${
                    v.status === 'approved' ? 'bg-green-100 text-green-700' :
                    v.status === 'draft' ? 'bg-yellow-100 text-yellow-700' :
                    'bg-gray-100 text-gray-600'
                  }`}>
                    {v.status}
                  </span>
                  {v.createdAt && (
                    <span className="text-gray-400 text-xs">
                      {new Date(v.createdAt).toLocaleDateString('de-DE')}
                    </span>
                  )}
                </div>
                {v.changeSummary && (
                  <p className="text-gray-600 text-xs mt-0.5">{v.changeSummary}</p>
                )}
                {v.changedSections.length > 0 && (
                  <div className="flex gap-1 mt-1">
                    {v.changedSections.map((s, i) => (
                      <span key={i} className="px-1.5 py-0.5 bg-purple-50 text-purple-600 rounded text-xs">{s}</span>
                    ))}
                  </div>
                )}
              </div>
            </div>
          ))}
        </div>
      )}
    </div>
  )
}
@@ -29,6 +29,8 @@ from .extraction_routes import router as extraction_router
from .tom_routes import router as tom_router
from .vendor_compliance_routes import router as vendor_compliance_router
from .incident_routes import router as incident_router
from .change_request_routes import router as change_request_router
from .generation_routes import router as generation_router

# Include sub-routers
router.include_router(audit_router)
@@ -59,6 +61,8 @@ router.include_router(extraction_router)
router.include_router(tom_router)
router.include_router(vendor_compliance_router)
router.include_router(incident_router)
router.include_router(change_request_router)
router.include_router(generation_router)

__all__ = [
    "router",
@@ -89,4 +93,6 @@ __all__ = [
    "tom_router",
    "vendor_compliance_router",
    "incident_router",
    "change_request_router",
    "generation_router",
]

181
backend-compliance/compliance/api/change_request_engine.py
Normal file
@@ -0,0 +1,181 @@
"""
Change-Request Engine — rule-based generation of change requests.

Generates change requests when compliance-relevant events occur:
- New high-risk use case → "DSFA erstellen"
- AI involvement → "DSFA Section 3/8 aktualisieren"
- New data categories → "VVT-Eintrag anlegen"
- VVT dpia_required toggle → "DSFA erstellen"
- New retention requirement → "Loeschfrist anlegen"
"""

import json
import logging
from typing import List, Optional

from sqlalchemy import text
from sqlalchemy.orm import Session

logger = logging.getLogger(__name__)


def generate_change_requests_for_vvt(
    db: Session,
    tenant_id: str,
    activity_data: dict,
    created_by: str = "system",
) -> List[str]:
    """Generate CRs when a VVT activity is created or updated.

    Returns list of created CR IDs.
    """
    cr_ids: List[str] = []

    # Rule 1: dpia_required=true → suggest DSFA
    if activity_data.get("dpia_required"):
        cr_id = _create_cr(
            db, tenant_id,
            trigger_type="vvt_dpia_required",
            target_document_type="dsfa",
            proposal_title=f"DSFA erstellen für VVT-Aktivität '{activity_data.get('name', 'Unbenannt')}'",
            proposal_body=f"Die VVT-Aktivität '{activity_data.get('name')}' wurde als DSFA-pflichtig markiert. "
                          "Eine Datenschutz-Folgenabschätzung nach Art. 35 DSGVO ist erforderlich.",
            proposed_changes={
                "source": "vvt_activity",
                "activity_name": activity_data.get("name"),
                "activity_vvt_id": activity_data.get("vvt_id"),
            },
            priority="high",
            created_by=created_by,
        )
        if cr_id:
            cr_ids.append(cr_id)

    # Rule 2: New data categories → suggest Loeschfrist
    categories = activity_data.get("personal_data_categories", [])
    if categories:
        cr_id = _create_cr(
            db, tenant_id,
            trigger_type="vvt_data_categories",
            target_document_type="loeschfristen",
            proposal_title=f"Löschfrist für {len(categories)} Datenkategorie(n) prüfen",
            proposal_body=f"Die VVT-Aktivität '{activity_data.get('name')}' verarbeitet folgende Datenkategorien: "
                          f"{', '.join(categories)}. Prüfen Sie, ob Löschfristen definiert sind.",
            proposed_changes={
                "source": "vvt_activity",
                "categories": categories,
            },
            priority="normal",
            created_by=created_by,
        )
        if cr_id:
            cr_ids.append(cr_id)

    return cr_ids


def generate_change_requests_for_use_case(
    db: Session,
    tenant_id: str,
    use_case_data: dict,
    created_by: str = "system",
) -> List[str]:
    """Generate CRs when a high-risk or AI use case is created.

    Returns list of created CR IDs.
    """
    cr_ids: List[str] = []
    risk_level = use_case_data.get("risk_level", "low")
    involves_ai = use_case_data.get("involves_ai", False)
    title = use_case_data.get("title", "Unbenannt")

    # Rule: High-risk use case → DSFA
    if risk_level in ("high", "critical"):
        cr_id = _create_cr(
            db, tenant_id,
            trigger_type="use_case_high_risk",
            target_document_type="dsfa",
            proposal_title=f"DSFA erstellen für '{title}' (Risiko: {risk_level})",
            proposal_body="Ein neuer Use Case mit hohem Risiko wurde erstellt. "
                          "Art. 35 DSGVO verlangt eine DSFA für Hochrisiko-Verarbeitungen.",
            proposed_changes={
                "source": "use_case",
                "title": title,
                "risk_level": risk_level,
            },
            priority="critical" if risk_level == "critical" else "high",
            created_by=created_by,
        )
        if cr_id:
            cr_ids.append(cr_id)

    # Rule: AI involvement → DSFA section update
    if involves_ai:
        cr_id = _create_cr(
            db, tenant_id,
            trigger_type="use_case_ai",
            target_document_type="dsfa",
            target_section="section_3",
            proposal_title=f"DSFA Sektion 3/8 aktualisieren (KI in '{title}')",
            proposal_body=f"Der Use Case '{title}' nutzt KI-Systeme. "
                          "Die DSFA-Risikoanalyse (Sektion 3) und Maßnahmen (Sektion 8) müssen aktualisiert werden.",
            proposed_changes={
                "source": "use_case",
                "title": title,
                "involves_ai": True,
            },
            priority="high",
            created_by=created_by,
        )
        if cr_id:
            cr_ids.append(cr_id)

    return cr_ids


def _create_cr(
    db: Session,
    tenant_id: str,
    trigger_type: str,
    target_document_type: str,
    proposal_title: str,
    proposal_body: str = "",
    proposed_changes: Optional[dict] = None,
    priority: str = "normal",
    target_document_id: Optional[str] = None,
    target_section: Optional[str] = None,
    trigger_source_id: Optional[str] = None,
    created_by: str = "system",
) -> Optional[str]:
    """Internal helper to insert a change request."""
    try:
        result = db.execute(
            text("""
                INSERT INTO compliance_change_requests
                    (tenant_id, trigger_type, trigger_source_id, target_document_type,
                     target_document_id, target_section, proposal_title, proposal_body,
                     proposed_changes, priority, created_by)
                VALUES (:tid, :trigger, :source_id, :doc_type,
                        :doc_id, :section, :title, :body,
                        CAST(:changes AS jsonb), :priority, :by)
                RETURNING id
            """),
            {
                "tid": tenant_id,
                "trigger": trigger_type,
                "source_id": trigger_source_id,
                "doc_type": target_document_type,
                "doc_id": target_document_id,
                "section": target_section,
                "title": proposal_title,
                "body": proposal_body,
                "changes": json.dumps(proposed_changes or {}),
                "priority": priority,
                "by": created_by,
            },
        )
        row = result.fetchone()
        return str(row[0]) if row else None
    except Exception as e:
        logger.error(f"Failed to create change request: {e}")
        return None
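The two VVT rules above reduce to pure predicates over the activity payload; separating the decision from the database insert makes them easy to unit-test. A sketch under that assumption — the function name `plan_vvt_change_requests` and the tuple return shape are hypothetical, not part of the engine above:

```python
from typing import List, Tuple

# Hypothetical pure-logic counterpart of generate_change_requests_for_vvt():
# returns (target_document_type, priority, title) tuples instead of inserting rows.
def plan_vvt_change_requests(activity: dict) -> List[Tuple[str, str, str]]:
    plans: List[Tuple[str, str, str]] = []
    name = activity.get("name", "Unbenannt")

    # Rule 1: dpia_required=true → suggest a DSFA (Art. 35 GDPR)
    if activity.get("dpia_required"):
        plans.append(("dsfa", "high", f"DSFA erstellen für VVT-Aktivität '{name}'"))

    # Rule 2: personal data categories present → review retention periods
    categories = activity.get("personal_data_categories", [])
    if categories:
        plans.append(("loeschfristen", "normal",
                      f"Löschfrist für {len(categories)} Datenkategorie(n) prüfen"))
    return plans

plans = plan_vvt_change_requests({
    "name": "Bewerbermanagement",
    "dpia_required": True,
    "personal_data_categories": ["Kontaktdaten", "Lebenslauf"],
})
print([p[0] for p in plans])  # → ['dsfa', 'loeschfristen']
```

With this split, `generate_change_requests_for_vvt` would only have to loop over the plan and call `_create_cr` per entry.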
427
backend-compliance/compliance/api/change_request_routes.py
Normal file
@@ -0,0 +1,427 @@
"""
FastAPI routes for the Change-Request System.

Endpoints:
    GET    /change-requests               — List (filter: status, doc_type, priority)
    GET    /change-requests/stats         — Summary counts
    GET    /change-requests/{id}          — Detail + audit log
    POST   /change-requests               — Create manually
    POST   /change-requests/{id}/accept   — Accept → create new version
    POST   /change-requests/{id}/reject   — Reject with reason
    POST   /change-requests/{id}/edit     — Edit proposal, then accept
    DELETE /change-requests/{id}          — Soft-delete
"""

import json
import logging
from datetime import datetime
from typing import Optional, List

from fastapi import APIRouter, Depends, HTTPException, Query, Header
from pydantic import BaseModel
from sqlalchemy import text
from sqlalchemy.orm import Session

from classroom_engine.database import get_db
from .tenant_utils import get_tenant_id

logger = logging.getLogger(__name__)
router = APIRouter(prefix="/change-requests", tags=["change-requests"])

VALID_STATUSES = {"pending", "accepted", "rejected", "edited_and_accepted"}
VALID_PRIORITIES = {"low", "normal", "high", "critical"}
VALID_DOC_TYPES = {"dsfa", "vvt", "tom", "loeschfristen", "obligation"}


# =============================================================================
# Pydantic Schemas
# =============================================================================

class ChangeRequestCreate(BaseModel):
    trigger_type: str = "manual"
    trigger_source_id: Optional[str] = None
    target_document_type: str
    target_document_id: Optional[str] = None
    target_section: Optional[str] = None
    proposal_title: str
    proposal_body: Optional[str] = None
    proposed_changes: dict = {}
    priority: str = "normal"


class ChangeRequestEdit(BaseModel):
    proposal_body: Optional[str] = None
    proposed_changes: Optional[dict] = None


class ChangeRequestReject(BaseModel):
    rejection_reason: str


# =============================================================================
# Helpers
# =============================================================================

def _cr_to_dict(row) -> dict:
    return {
        "id": str(row["id"]),
        "tenant_id": row["tenant_id"],
        "trigger_type": row["trigger_type"],
        "trigger_source_id": str(row["trigger_source_id"]) if row["trigger_source_id"] else None,
        "target_document_type": row["target_document_type"],
        "target_document_id": str(row["target_document_id"]) if row["target_document_id"] else None,
        "target_section": row["target_section"],
        "proposal_title": row["proposal_title"],
        "proposal_body": row["proposal_body"],
        "proposed_changes": row["proposed_changes"] or {},
        "status": row["status"],
        "priority": row["priority"],
        "decided_by": row["decided_by"],
        "decided_at": row["decided_at"].isoformat() if row["decided_at"] else None,
        "rejection_reason": row["rejection_reason"],
        "resulting_version_id": str(row["resulting_version_id"]) if row["resulting_version_id"] else None,
        "created_by": row["created_by"],
        "created_at": row["created_at"].isoformat() if row["created_at"] else None,
        "updated_at": row["updated_at"].isoformat() if row["updated_at"] else None,
    }


def _log_cr_audit(db, cr_id, tenant_id, action, performed_by="system", before_state=None, after_state=None):
    db.execute(
        text("""
            INSERT INTO compliance_change_request_audit
                (change_request_id, tenant_id, action, performed_by, before_state, after_state)
            VALUES (:cr_id, :tid, :action, :by, CAST(:before AS jsonb), CAST(:after AS jsonb))
        """),
        {
            "cr_id": cr_id,
            "tid": tenant_id,
            "action": action,
            "by": performed_by,
            "before": json.dumps(before_state) if before_state else None,
            "after": json.dumps(after_state) if after_state else None,
        },
    )


# =============================================================================
# Routes
# =============================================================================

@router.get("")
async def list_change_requests(
    status: Optional[str] = Query(None),
    target_document_type: Optional[str] = Query(None),
    priority: Optional[str] = Query(None),
    skip: int = Query(0, ge=0),
    limit: int = Query(100, ge=1, le=500),
    tid: str = Depends(get_tenant_id),
    db: Session = Depends(get_db),
):
    """List change requests with optional filters."""
    sql = "SELECT * FROM compliance_change_requests WHERE tenant_id = :tid AND NOT is_deleted"
    params = {"tid": tid}

    if status:
        sql += " AND status = :status"
        params["status"] = status
    if target_document_type:
        sql += " AND target_document_type = :doc_type"
        params["doc_type"] = target_document_type
    if priority:
        sql += " AND priority = :priority"
        params["priority"] = priority

    sql += " ORDER BY CASE priority WHEN 'critical' THEN 0 WHEN 'high' THEN 1 WHEN 'normal' THEN 2 ELSE 3 END, created_at DESC"
    sql += " LIMIT :limit OFFSET :skip"
    params["limit"] = limit
    params["skip"] = skip

    rows = db.execute(text(sql), params).fetchall()
    return [_cr_to_dict(r) for r in rows]
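The CASE expression in the ORDER BY above maps each priority to a sortable rank so critical items surface first. The same ordering can be expressed in Python, which is handy for client-side re-sorting or for unit-testing the intent. A sketch for illustration: the rank mapping mirrors the SQL, everything else (the `sort_key` helper, the sample data) is assumed.

```python
from datetime import datetime

PRIORITY_RANK = {"critical": 0, "high": 1, "normal": 2}  # anything else sorts last (rank 3)

def sort_key(cr: dict):
    # critical first, then high, then normal, then the rest;
    # newest first within the same priority (matches created_at DESC)
    rank = PRIORITY_RANK.get(cr["priority"], 3)
    return (rank, -cr["created_at"].timestamp())

crs = [
    {"priority": "normal",   "created_at": datetime(2025, 1, 3)},
    {"priority": "critical", "created_at": datetime(2025, 1, 1)},
    {"priority": "high",     "created_at": datetime(2025, 1, 2)},
]
ordered = sorted(crs, key=sort_key)
print([c["priority"] for c in ordered])  # → ['critical', 'high', 'normal']
```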
|
||||
|
||||
@router.get("/stats")
async def get_stats(
    tid: str = Depends(get_tenant_id),
    db: Session = Depends(get_db),
):
    """Summary counts for change requests."""
    rows = db.execute(
        text("""
            SELECT
                COUNT(*) FILTER (WHERE status = 'pending') AS total_pending,
                COUNT(*) FILTER (WHERE status = 'pending' AND priority = 'critical') AS critical_count,
                COUNT(*) FILTER (WHERE status = 'accepted' OR status = 'edited_and_accepted') AS total_accepted,
                COUNT(*) FILTER (WHERE status = 'rejected') AS total_rejected
            FROM compliance_change_requests
            WHERE tenant_id = :tid AND NOT is_deleted
        """),
        {"tid": tid},
    ).fetchone()

    # By document type
    doc_type_rows = db.execute(
        text("""
            SELECT target_document_type, COUNT(*)
            FROM compliance_change_requests
            WHERE tenant_id = :tid AND status = 'pending' AND NOT is_deleted
            GROUP BY target_document_type
        """),
        {"tid": tid},
    ).fetchall()

    return {
        "total_pending": rows[0] or 0,
        "critical_count": rows[1] or 0,
        "total_accepted": rows[2] or 0,
        "total_rejected": rows[3] or 0,
        "by_document_type": {r[0]: r[1] for r in doc_type_rows},
    }
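The `COUNT(*) FILTER` aggregation above can be mirrored in plain Python when unit-testing the expected stats shape without a database; the fixture rows below are invented, only the counting logic matches the SQL:

```python
# In-memory mirror of the COUNT(*) FILTER aggregation used by get_stats.
# The fixture rows are hypothetical test data, not part of this commit.
rows = [
    {"status": "pending", "priority": "critical"},
    {"status": "pending", "priority": "normal"},
    {"status": "edited_and_accepted", "priority": "high"},
    {"status": "rejected", "priority": "normal"},
]
stats = {
    "total_pending": sum(r["status"] == "pending" for r in rows),
    "critical_count": sum(
        r["status"] == "pending" and r["priority"] == "critical" for r in rows
    ),
    "total_accepted": sum(
        r["status"] in ("accepted", "edited_and_accepted") for r in rows
    ),
    "total_rejected": sum(r["status"] == "rejected" for r in rows),
}
print(stats)
```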


@router.get("/{cr_id}")
async def get_change_request(
    cr_id: str,
    tid: str = Depends(get_tenant_id),
    db: Session = Depends(get_db),
):
    """Get change request detail with audit log."""
    row = db.execute(
        text("SELECT * FROM compliance_change_requests WHERE id = :id AND tenant_id = :tid AND NOT is_deleted"),
        {"id": cr_id, "tid": tid},
    ).fetchone()
    if not row:
        raise HTTPException(status_code=404, detail="Change request not found")

    cr = _cr_to_dict(row)

    # Attach audit log
    audit_rows = db.execute(
        text("""
            SELECT id, action, performed_by, before_state, after_state, created_at
            FROM compliance_change_request_audit
            WHERE change_request_id = :cr_id
            ORDER BY created_at DESC
        """),
        {"cr_id": cr_id},
    ).fetchall()

    cr["audit_log"] = [
        {
            "id": str(a[0]),
            "action": a[1],
            "performed_by": a[2],
            "before_state": a[3],
            "after_state": a[4],
            "created_at": a[5].isoformat() if a[5] else None,
        }
        for a in audit_rows
    ]
    return cr


@router.post("", status_code=201)
async def create_change_request(
    body: ChangeRequestCreate,
    tid: str = Depends(get_tenant_id),
    db: Session = Depends(get_db),
    x_user_id: Optional[str] = Header(None, alias="X-User-ID"),
):
    """Create a change request manually."""
    if body.target_document_type not in VALID_DOC_TYPES:
        raise HTTPException(status_code=400, detail=f"Invalid target_document_type: {body.target_document_type}")
    if body.priority not in VALID_PRIORITIES:
        raise HTTPException(status_code=400, detail=f"Invalid priority: {body.priority}")

    row = db.execute(
        text("""
            INSERT INTO compliance_change_requests
                (tenant_id, trigger_type, trigger_source_id, target_document_type,
                 target_document_id, target_section, proposal_title, proposal_body,
                 proposed_changes, priority, created_by)
            VALUES (:tid, :trigger_type, :trigger_source_id, :doc_type,
                    :doc_id, :section, :title, :body,
                    CAST(:changes AS jsonb), :priority, :created_by)
            RETURNING *
        """),
        {
            "tid": tid,
            "trigger_type": body.trigger_type,
            "trigger_source_id": body.trigger_source_id,
            "doc_type": body.target_document_type,
            "doc_id": body.target_document_id,
            "section": body.target_section,
            "title": body.proposal_title,
            "body": body.proposal_body,
            "changes": json.dumps(body.proposed_changes),
            "priority": body.priority,
            "created_by": x_user_id or "system",
        },
    ).fetchone()

    _log_cr_audit(db, row["id"], tid, "CREATED", x_user_id or "system")
    db.commit()

    return _cr_to_dict(row)


@router.post("/{cr_id}/accept")
async def accept_change_request(
    cr_id: str,
    tid: str = Depends(get_tenant_id),
    db: Session = Depends(get_db),
    x_user_id: Optional[str] = Header(None, alias="X-User-ID"),
):
    """Accept a change request → creates a new document version."""
    row = db.execute(
        text("SELECT * FROM compliance_change_requests WHERE id = :id AND tenant_id = :tid AND NOT is_deleted"),
        {"id": cr_id, "tid": tid},
    ).fetchone()
    if not row:
        raise HTTPException(status_code=404, detail="Change request not found")
    if row["status"] != "pending":
        raise HTTPException(status_code=422, detail=f"Cannot accept CR in status '{row['status']}'")

    user = x_user_id or "system"
    before_state = {"status": row["status"]}

    # If there's a target document, create a version snapshot
    resulting_version_id = None
    if row["target_document_id"]:
        try:
            from .versioning_utils import create_version_snapshot
            version = create_version_snapshot(
                db,
                doc_type=row["target_document_type"],
                doc_id=str(row["target_document_id"]),
                tenant_id=tid,
                snapshot=row["proposed_changes"] or {},
                change_summary=f"Accepted CR: {row['proposal_title']}",
                created_by=user,
            )
            resulting_version_id = version["id"]
        except Exception as e:
            logger.warning(f"Could not create version for CR {cr_id}: {e}")

    # Update CR status
    updated = db.execute(
        text("""
            UPDATE compliance_change_requests
            SET status = 'accepted', decided_by = :user, decided_at = NOW(),
                resulting_version_id = :ver_id, updated_at = NOW()
            WHERE id = :id AND tenant_id = :tid
            RETURNING *
        """),
        {"id": cr_id, "tid": tid, "user": user, "ver_id": resulting_version_id},
    ).fetchone()

    _log_cr_audit(db, cr_id, tid, "ACCEPTED", user, before_state, {"status": "accepted"})
    db.commit()

    return _cr_to_dict(updated)


@router.post("/{cr_id}/reject")
async def reject_change_request(
    cr_id: str,
    body: ChangeRequestReject,
    tid: str = Depends(get_tenant_id),
    db: Session = Depends(get_db),
    x_user_id: Optional[str] = Header(None, alias="X-User-ID"),
):
    """Reject a change request with reason."""
    row = db.execute(
        text("SELECT * FROM compliance_change_requests WHERE id = :id AND tenant_id = :tid AND NOT is_deleted"),
        {"id": cr_id, "tid": tid},
    ).fetchone()
    if not row:
        raise HTTPException(status_code=404, detail="Change request not found")
    if row["status"] != "pending":
        raise HTTPException(status_code=422, detail=f"Cannot reject CR in status '{row['status']}'")

    user = x_user_id or "system"

    updated = db.execute(
        text("""
            UPDATE compliance_change_requests
            SET status = 'rejected', decided_by = :user, decided_at = NOW(),
                rejection_reason = :reason, updated_at = NOW()
            WHERE id = :id AND tenant_id = :tid
            RETURNING *
        """),
        {"id": cr_id, "tid": tid, "user": user, "reason": body.rejection_reason},
    ).fetchone()

    _log_cr_audit(db, cr_id, tid, "REJECTED", user, {"status": "pending"}, {"status": "rejected", "reason": body.rejection_reason})
    db.commit()

    return _cr_to_dict(updated)


@router.post("/{cr_id}/edit")
async def edit_change_request(
    cr_id: str,
    body: ChangeRequestEdit,
    tid: str = Depends(get_tenant_id),
    db: Session = Depends(get_db),
    x_user_id: Optional[str] = Header(None, alias="X-User-ID"),
):
    """Edit the proposal, then auto-accept."""
    row = db.execute(
        text("SELECT * FROM compliance_change_requests WHERE id = :id AND tenant_id = :tid AND NOT is_deleted"),
        {"id": cr_id, "tid": tid},
    ).fetchone()
    if not row:
        raise HTTPException(status_code=404, detail="Change request not found")
    if row["status"] != "pending":
        raise HTTPException(status_code=422, detail=f"Cannot edit CR in status '{row['status']}'")

    user = x_user_id or "system"
    updates = []
    params = {"id": cr_id, "tid": tid, "user": user}

    if body.proposal_body is not None:
        updates.append("proposal_body = :body")
        params["body"] = body.proposal_body
    if body.proposed_changes is not None:
        updates.append("proposed_changes = CAST(:changes AS jsonb)")
        params["changes"] = json.dumps(body.proposed_changes)

    updates.append("status = 'edited_and_accepted'")
    updates.append("decided_by = :user")
    updates.append("decided_at = NOW()")
    updates.append("updated_at = NOW()")

    sql = f"UPDATE compliance_change_requests SET {', '.join(updates)} WHERE id = :id AND tenant_id = :tid RETURNING *"
    updated = db.execute(text(sql), params).fetchone()

    _log_cr_audit(db, cr_id, tid, "EDITED_AND_ACCEPTED", user, {"status": "pending"}, {"status": "edited_and_accepted"})
    db.commit()

    return _cr_to_dict(updated)


@router.delete("/{cr_id}")
async def delete_change_request(
    cr_id: str,
    tid: str = Depends(get_tenant_id),
    db: Session = Depends(get_db),
    x_user_id: Optional[str] = Header(None, alias="X-User-ID"),
):
    """Soft-delete a change request."""
    result = db.execute(
        text("""
            UPDATE compliance_change_requests
            SET is_deleted = TRUE, updated_at = NOW()
            WHERE id = :id AND tenant_id = :tid AND NOT is_deleted
        """),
        {"id": cr_id, "tid": tid},
    )
    if result.rowcount == 0:
        raise HTTPException(status_code=404, detail="Change request not found")

    _log_cr_audit(db, cr_id, tid, "DELETED", x_user_id or "system")
    db.commit()

    return {"success": True, "message": "Change request deleted"}
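The inbox list endpoint orders results by priority rank, then newest first; a hypothetical client-side mirror of that `ORDER BY` (not part of this commit, numeric timestamps assumed for brevity):

```python
# Client-side mirror of the list endpoint's ORDER BY CASE priority ... ,
# created_at DESC. The CR dicts below are invented fixtures.
_PRIORITY_RANK = {"critical": 0, "high": 1, "normal": 2}

def sort_change_requests(crs):
    return sorted(
        crs,
        key=lambda cr: (
            _PRIORITY_RANK.get(cr.get("priority"), 3),  # unknown -> lowest rank
            -cr.get("created_at", 0),  # DESC on a numeric timestamp
        ),
    )

crs = [
    {"id": "a", "priority": "normal", "created_at": 2},
    {"id": "b", "priority": "critical", "created_at": 1},
    {"id": "c", "priority": "normal", "created_at": 3},
]
print([cr["id"] for cr in sort_change_requests(crs)])  # ['b', 'c', 'a']
```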
@@ -4,7 +4,9 @@ FastAPI routes for Company Profile CRUD with audit logging.
 Endpoints:
 - GET /v1/company-profile: Get company profile for a tenant
 - POST /v1/company-profile: Create or update company profile
 - DELETE /v1/company-profile: Delete company profile
 - GET /v1/company-profile/audit: Get audit log for a tenant
+- GET /v1/company-profile/template-context: Flat dict for template substitution
 """
 
+import json
@@ -51,6 +53,17 @@ class CompanyProfileRequest(BaseModel):
     legal_contact_email: Optional[str] = None
     machine_builder: Optional[dict] = None
     is_complete: bool = False
+    # Phase 2 fields
+    repos: list[dict] = []
+    document_sources: list[dict] = []
+    processing_systems: list[dict] = []
+    ai_systems: list[dict] = []
+    technical_contacts: list[dict] = []
+    subject_to_nis2: bool = False
+    subject_to_ai_act: bool = False
+    subject_to_iso27001: bool = False
+    supervisory_authority: Optional[str] = None
+    review_cycle_months: int = 12
 
 
 class CompanyProfileResponse(BaseModel):
@@ -84,6 +97,17 @@ class CompanyProfileResponse(BaseModel):
     completed_at: Optional[str]
     created_at: str
     updated_at: str
+    # Phase 2 fields
+    repos: list[dict] = []
+    document_sources: list[dict] = []
+    processing_systems: list[dict] = []
+    ai_systems: list[dict] = []
+    technical_contacts: list[dict] = []
+    subject_to_nis2: bool = False
+    subject_to_ai_act: bool = False
+    subject_to_iso27001: bool = False
+    supervisory_authority: Optional[str] = None
+    review_cycle_months: int = 12
 
 
 class AuditEntryResponse(BaseModel):
@@ -99,6 +123,22 @@ class AuditListResponse(BaseModel):
     total: int
 
 
+# =============================================================================
+# SQL column lists — keep in sync with SELECT/INSERT
+# =============================================================================
+
+_BASE_COLUMNS = """id, tenant_id, company_name, legal_form, industry, founded_year,
+business_model, offerings, company_size, employee_count, annual_revenue,
+headquarters_country, headquarters_city, has_international_locations,
+international_countries, target_markets, primary_jurisdiction,
+is_data_controller, is_data_processor, uses_ai, ai_use_cases,
+dpo_name, dpo_email, legal_contact_name, legal_contact_email,
+machine_builder, is_complete, completed_at, created_at, updated_at,
+repos, document_sources, processing_systems, ai_systems, technical_contacts,
+subject_to_nis2, subject_to_ai_act, subject_to_iso27001,
+supervisory_authority, review_cycle_months"""
+
+
 # =============================================================================
 # HELPERS
 # =============================================================================
@@ -136,6 +176,17 @@ def row_to_response(row) -> CompanyProfileResponse:
         completed_at=str(row[27]) if row[27] else None,
         created_at=str(row[28]),
         updated_at=str(row[29]),
+        # Phase 2 fields (indices 30-39)
+        repos=row[30] if isinstance(row[30], list) else [],
+        document_sources=row[31] if isinstance(row[31], list) else [],
+        processing_systems=row[32] if isinstance(row[32], list) else [],
+        ai_systems=row[33] if isinstance(row[33], list) else [],
+        technical_contacts=row[34] if isinstance(row[34], list) else [],
+        subject_to_nis2=row[35] or False,
+        subject_to_ai_act=row[36] or False,
+        subject_to_iso27001=row[37] or False,
+        supervisory_authority=row[38],
+        review_cycle_months=row[39] or 12,
     )
 
 
@@ -171,14 +222,7 @@ async def get_company_profile(
     db = SessionLocal()
     try:
         result = db.execute(
-            """SELECT id, tenant_id, company_name, legal_form, industry, founded_year,
-            business_model, offerings, company_size, employee_count, annual_revenue,
-            headquarters_country, headquarters_city, has_international_locations,
-            international_countries, target_markets, primary_jurisdiction,
-            is_data_controller, is_data_processor, uses_ai, ai_use_cases,
-            dpo_name, dpo_email, legal_contact_name, legal_contact_email,
-            machine_builder, is_complete, completed_at, created_at, updated_at
-            FROM compliance_company_profiles WHERE tenant_id = :tenant_id""",
+            f"SELECT {_BASE_COLUMNS} FROM compliance_company_profiles WHERE tenant_id = :tenant_id",
             {"tenant_id": tid},
         )
         row = result.fetchone()
@@ -218,14 +262,21 @@ async def upsert_company_profile(
             international_countries, target_markets, primary_jurisdiction,
             is_data_controller, is_data_processor, uses_ai, ai_use_cases,
             dpo_name, dpo_email, legal_contact_name, legal_contact_email,
-            machine_builder, is_complete)
+            machine_builder, is_complete,
+            repos, document_sources, processing_systems, ai_systems, technical_contacts,
+            subject_to_nis2, subject_to_ai_act, subject_to_iso27001,
+            supervisory_authority, review_cycle_months)
         VALUES (:tid, :company_name, :legal_form, :industry, :founded_year,
             :business_model, :offerings::jsonb, :company_size, :employee_count, :annual_revenue,
             :hq_country, :hq_city, :has_intl, :intl_countries::jsonb,
             :target_markets::jsonb, :jurisdiction,
             :is_controller, :is_processor, :uses_ai, :ai_use_cases::jsonb,
             :dpo_name, :dpo_email, :legal_name, :legal_email,
-            :machine_builder::jsonb, :is_complete)
+            :machine_builder::jsonb, :is_complete,
+            :repos::jsonb, :document_sources::jsonb, :processing_systems::jsonb,
+            :ai_systems::jsonb, :technical_contacts::jsonb,
+            :subject_to_nis2, :subject_to_ai_act, :subject_to_iso27001,
+            :supervisory_authority, :review_cycle_months)
         ON CONFLICT (tenant_id) DO UPDATE SET
             company_name = EXCLUDED.company_name,
             legal_form = EXCLUDED.legal_form,
@@ -252,6 +303,16 @@ async def upsert_company_profile(
             legal_contact_email = EXCLUDED.legal_contact_email,
             machine_builder = EXCLUDED.machine_builder,
             is_complete = EXCLUDED.is_complete,
+            repos = EXCLUDED.repos,
+            document_sources = EXCLUDED.document_sources,
+            processing_systems = EXCLUDED.processing_systems,
+            ai_systems = EXCLUDED.ai_systems,
+            technical_contacts = EXCLUDED.technical_contacts,
+            subject_to_nis2 = EXCLUDED.subject_to_nis2,
+            subject_to_ai_act = EXCLUDED.subject_to_ai_act,
+            subject_to_iso27001 = EXCLUDED.subject_to_iso27001,
+            supervisory_authority = EXCLUDED.supervisory_authority,
+            review_cycle_months = EXCLUDED.review_cycle_months,
             updated_at = NOW()
             {completed_at_clause}""",
         {
@@ -281,6 +342,16 @@ async def upsert_company_profile(
             "legal_email": profile.legal_contact_email,
             "machine_builder": json.dumps(profile.machine_builder) if profile.machine_builder else None,
             "is_complete": profile.is_complete,
+            "repos": json.dumps(profile.repos),
+            "document_sources": json.dumps(profile.document_sources),
+            "processing_systems": json.dumps(profile.processing_systems),
+            "ai_systems": json.dumps(profile.ai_systems),
+            "technical_contacts": json.dumps(profile.technical_contacts),
+            "subject_to_nis2": profile.subject_to_nis2,
+            "subject_to_ai_act": profile.subject_to_ai_act,
+            "subject_to_iso27001": profile.subject_to_iso27001,
+            "supervisory_authority": profile.supervisory_authority,
+            "review_cycle_months": profile.review_cycle_months,
         },
     )
 
@@ -291,14 +362,7 @@ async def upsert_company_profile(
 
     # Fetch and return
     result = db.execute(
-        """SELECT id, tenant_id, company_name, legal_form, industry, founded_year,
-        business_model, offerings, company_size, employee_count, annual_revenue,
-        headquarters_country, headquarters_city, has_international_locations,
-        international_countries, target_markets, primary_jurisdiction,
-        is_data_controller, is_data_processor, uses_ai, ai_use_cases,
-        dpo_name, dpo_email, legal_contact_name, legal_contact_email,
-        machine_builder, is_complete, completed_at, created_at, updated_at
-        FROM compliance_company_profiles WHERE tenant_id = :tid""",
+        f"SELECT {_BASE_COLUMNS} FROM compliance_company_profiles WHERE tenant_id = :tid",
         {"tid": tid},
     )
     row = result.fetchone()
@@ -347,6 +411,68 @@ async def delete_company_profile(
         db.close()
 
 
+@router.get("/template-context")
+async def get_template_context(
+    tenant_id: str = "default",
+    x_tenant_id: Optional[str] = Header(None, alias="X-Tenant-ID"),
+):
+    """Return flat dict for Jinja2 template substitution in document generation."""
+    tid = x_tenant_id or tenant_id
+    db = SessionLocal()
+    try:
+        result = db.execute(
+            f"SELECT {_BASE_COLUMNS} FROM compliance_company_profiles WHERE tenant_id = :tid",
+            {"tid": tid},
+        )
+        row = result.fetchone()
+        if not row:
+            raise HTTPException(status_code=404, detail="Company profile not found — fill Stammdaten first")
+
+        resp = row_to_response(row)
+        # Build flat context dict for templates
+        ctx = {
+            "company_name": resp.company_name,
+            "legal_form": resp.legal_form,
+            "industry": resp.industry,
+            "business_model": resp.business_model,
+            "company_size": resp.company_size,
+            "employee_count": resp.employee_count,
+            "headquarters_country": resp.headquarters_country,
+            "headquarters_city": resp.headquarters_city,
+            "primary_jurisdiction": resp.primary_jurisdiction,
+            "is_data_controller": resp.is_data_controller,
+            "is_data_processor": resp.is_data_processor,
+            "uses_ai": resp.uses_ai,
+            "dpo_name": resp.dpo_name or "",
+            "dpo_email": resp.dpo_email or "",
+            "legal_contact_name": resp.legal_contact_name or "",
+            "legal_contact_email": resp.legal_contact_email or "",
+            "supervisory_authority": resp.supervisory_authority or "",
+            "review_cycle_months": resp.review_cycle_months,
+            "subject_to_nis2": resp.subject_to_nis2,
+            "subject_to_ai_act": resp.subject_to_ai_act,
+            "subject_to_iso27001": resp.subject_to_iso27001,
+            # Lists as-is for iteration in templates
+            "offerings": resp.offerings,
+            "target_markets": resp.target_markets,
+            "international_countries": resp.international_countries,
+            "ai_use_cases": resp.ai_use_cases,
+            "repos": resp.repos,
+            "document_sources": resp.document_sources,
+            "processing_systems": resp.processing_systems,
+            "ai_systems": resp.ai_systems,
+            "technical_contacts": resp.technical_contacts,
+            # Derived helper values
+            "has_ai_systems": len(resp.ai_systems) > 0,
+            "processing_system_count": len(resp.processing_systems),
+            "ai_system_count": len(resp.ai_systems),
+            "is_complete": resp.is_complete,
+        }
+        return ctx
+    finally:
+        db.close()
+
+
 @router.get("/audit", response_model=AuditListResponse)
 async def get_audit_log(
     tenant_id: str = "default",
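The endpoint docstring mentions Jinja2 for substitution; as a dependency-free sketch of how the flat context payload plugs into a template, stdlib `string.Template` stands in here (all payload values below are invented samples):

```python
from string import Template

# Invented sample of the flat /template-context payload; the real payload
# carries many more keys (see the endpoint above).
ctx = {"company_name": "ACME GmbH", "dpo_name": "J. Doe", "dpo_email": "dpo@acme.example"}

tpl = Template("Verantwortlicher: $company_name, DSB: $dpo_name ($dpo_email)")
print(tpl.substitute(ctx))  # Verantwortlicher: ACME GmbH, DSB: J. Doe (dpo@acme.example)
```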

@@ -0,0 +1,15 @@
"""Document generation templates for compliance documents."""

from .dsfa_template import generate_dsfa_draft
from .vvt_template import generate_vvt_drafts
from .loeschfristen_template import generate_loeschfristen_drafts
from .tom_template import generate_tom_drafts
from .obligation_template import generate_obligation_drafts

__all__ = [
    "generate_dsfa_draft",
    "generate_vvt_drafts",
    "generate_loeschfristen_drafts",
    "generate_tom_drafts",
    "generate_obligation_drafts",
]
@@ -0,0 +1,82 @@
"""DSFA template generator — creates DSFA skeleton from company profile."""


def generate_dsfa_draft(ctx: dict) -> dict:
    """Generate a DSFA draft document from template context.

    Args:
        ctx: Flat dict from company-profile/template-context endpoint

    Returns:
        Dict with DSFA fields ready for creation
    """
    company = ctx.get("company_name", "Unbekannt")
    dpo = ctx.get("dpo_name", "")
    dpo_email = ctx.get("dpo_email", "")

    sections = {
        "section_1": {
            "title": "Beschreibung der Verarbeitung",
            "content": f"Die {company} führt eine Datenschutz-Folgenabschätzung gemäß Art. 35 DSGVO durch.\n\n"
                       f"**Verantwortlicher:** {company}\n"
                       f"**Datenschutzbeauftragter:** {dpo} ({dpo_email})\n"
                       f"**Zuständige Aufsichtsbehörde:** {ctx.get('supervisory_authority', 'Nicht angegeben')}",
        },
        "section_2": {
            "title": "Notwendigkeit und Verhältnismäßigkeit",
            "content": "Die Verarbeitung ist zur Erreichung des beschriebenen Zwecks erforderlich. "
                       "Alternative, weniger eingriffsintensive Maßnahmen wurden geprüft.",
        },
        "section_3": {
            "title": "Risiken für die Rechte und Freiheiten",
            "content": _generate_risk_section(ctx),
        },
        "section_6": {
            "title": "Stellungnahme des DSB",
            "content": f"Der Datenschutzbeauftragte ({dpo}) wurde konsultiert." if dpo else
                       "Ein Datenschutzbeauftragter wurde noch nicht benannt.",
        },
    }

    ai_systems = ctx.get("ai_systems", [])
    involves_ai = len(ai_systems) > 0

    return {
        "title": f"DSFA — {company}",
        "description": f"Automatisch generierte Datenschutz-Folgenabschätzung für {company}",
        "status": "draft",
        "risk_level": "high" if involves_ai else "medium",
        "involves_ai": involves_ai,
        "dpo_name": dpo,
        "sections": sections,
        "processing_systems": [s.get("name", "") for s in ctx.get("processing_systems", [])],
        "ai_systems_summary": [
            {"name": s.get("name"), "risk": s.get("risk_category", "unknown")}
            for s in ai_systems
        ],
    }


def _generate_risk_section(ctx: dict) -> str:
    lines = ["## Risikoanalyse\n"]

    if ctx.get("has_ai_systems"):
        lines.append("### KI-Systeme\n")
        for s in ctx.get("ai_systems", []):
            risk = s.get("risk_category", "unbekannt")
            lines.append(f"- **{s.get('name', 'N/A')}**: Zweck: {s.get('purpose', 'N/A')}, "
                         f"Risiko: {risk}, Human Oversight: {'Ja' if s.get('has_human_oversight') else 'Nein'}")
        lines.append("")

    if ctx.get("subject_to_ai_act"):
        lines.append("**Hinweis:** Das Unternehmen unterliegt dem EU AI Act. "
                     "KI-spezifische Risiken müssen gemäß der KI-Verordnung bewertet werden.\n")

    if ctx.get("subject_to_nis2"):
        lines.append("**Hinweis:** NIS2-Richtlinie ist anwendbar. "
                     "Cybersicherheitsrisiken sind zusätzlich zu bewerten.\n")

    if not ctx.get("has_ai_systems") and not ctx.get("subject_to_nis2"):
        lines.append("Standardrisiken der Datenverarbeitung sind zu bewerten.\n")

    return "\n".join(lines)
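A condensed, standalone re-sketch of the draft's risk classification above (only the AI-driven `risk_level` decision is reproduced; the ctx values are invented):

```python
# Re-sketch of generate_dsfa_draft's classification: any registered AI system
# escalates the draft's risk_level from "medium" to "high".
def classify_risk(ctx):
    ai_systems = ctx.get("ai_systems", [])
    involves_ai = len(ai_systems) > 0
    return {"risk_level": "high" if involves_ai else "medium", "involves_ai": involves_ai}

print(classify_risk({"ai_systems": [{"name": "Scoring"}]}))
print(classify_risk({}))
```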
@@ -0,0 +1,49 @@
"""Loeschfristen template generator — creates retention policies per data category."""

# Standard DSGVO retention periods
_STANDARD_PERIODS = {
    "Bankdaten": {"duration": "10 Jahre", "legal_basis": "§ 257 HGB, § 147 AO"},
    "Steuer-ID": {"duration": "10 Jahre", "legal_basis": "§ 147 AO"},
    "Bewerbungsunterlagen": {"duration": "6 Monate", "legal_basis": "§ 15 AGG"},
    "Lohnabrechnungen": {"duration": "6 Jahre", "legal_basis": "§ 257 HGB"},
    "Gesundheitsdaten": {"duration": "10 Jahre", "legal_basis": "§ 630f BGB"},
    "Kundendaten": {"duration": "3 Jahre", "legal_basis": "§ 195 BGB (Verjährung)"},
    "Vertragsdaten": {"duration": "10 Jahre", "legal_basis": "§ 257 HGB"},
    "Kommunikationsdaten": {"duration": "6 Monate", "legal_basis": "Art. 5 Abs. 1e DSGVO"},
    "Zugriffsprotokolle": {"duration": "12 Monate", "legal_basis": "Art. 5 Abs. 1e DSGVO"},
    "Mitarbeiterdaten": {"duration": "3 Jahre nach Austritt", "legal_basis": "§ 195 BGB"},
}


def generate_loeschfristen_drafts(ctx: dict) -> list[dict]:
    """Generate retention policy drafts based on processing systems.

    Args:
        ctx: Flat dict from company-profile/template-context

    Returns:
        List of Loeschfristen dicts ready for creation
    """
    # Collect all data categories from processing systems
    all_categories = set()
    for system in ctx.get("processing_systems", []):
        for cat in system.get("personal_data_categories", []):
            all_categories.add(cat)

    policies = []
    for i, category in enumerate(sorted(all_categories), 1):
        standard = _STANDARD_PERIODS.get(category, {})
        policy = {
            "policy_id": f"LF-AUTO-{i:03d}",
            "data_category": category,
            "retention_period": standard.get("duration", "Noch festzulegen"),
            "legal_basis": standard.get("legal_basis", "Zu prüfen"),
            "deletion_method": "Automatische Löschung nach Ablauf",
            "responsible": ctx.get("dpo_name", "DSB"),
            "status": "draft",
            "review_cycle_months": ctx.get("review_cycle_months", 12),
            "notes": "Automatisch generiert aus Stammdaten. Bitte prüfen und anpassen.",
        }
        policies.append(policy)

    return policies
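A standalone sketch of the category collection and fallback logic above (one standard period inlined; the system names and categories are invented fixtures):

```python
# Collect distinct data categories across processing systems, then map each
# to its standard period with a "Noch festzulegen" fallback, mirroring
# generate_loeschfristen_drafts above.
periods = {"Bankdaten": {"duration": "10 Jahre"}}

ctx = {
    "processing_systems": [
        {"name": "ERP", "personal_data_categories": ["Bankdaten", "Kundendaten"]},
        {"name": "CRM", "personal_data_categories": ["Kundendaten"]},
    ]
}

categories = sorted({
    cat
    for system in ctx["processing_systems"]
    for cat in system.get("personal_data_categories", [])
})
drafts = [
    {
        "policy_id": f"LF-AUTO-{i:03d}",
        "data_category": cat,
        "retention_period": periods.get(cat, {}).get("duration", "Noch festzulegen"),
    }
    for i, cat in enumerate(categories, 1)
]
print(drafts)
```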
@@ -0,0 +1,141 @@
"""Obligation template generator — creates standard DSGVO obligations."""

_DSGVO_OBLIGATIONS = [
    {
        "title": "Verzeichnis der Verarbeitungstätigkeiten führen",
        "regulation": "DSGVO",
        "article": "Art. 30",
        "description": "Das Verzeichnis muss alle Verarbeitungstätigkeiten mit personenbezogenen Daten dokumentieren.",
        "category": "documentation",
        "priority": "high",
    },
    {
        "title": "Datenschutz-Folgenabschätzung durchführen",
        "regulation": "DSGVO",
        "article": "Art. 35",
        "description": "Für Verarbeitungen mit hohem Risiko muss eine DSFA vor Beginn der Verarbeitung durchgeführt werden.",
        "category": "risk_assessment",
        "priority": "high",
    },
    {
        "title": "Technisch-organisatorische Maßnahmen implementieren",
        "regulation": "DSGVO",
        "article": "Art. 32",
        "description": "Angemessene TOMs zum Schutz personenbezogener Daten unter Berücksichtigung des Stands der Technik.",
        "category": "technical",
        "priority": "high",
    },
    {
        "title": "Betroffenenrechte sicherstellen",
        "regulation": "DSGVO",
        "article": "Art. 12-22",
        "description": "Auskunft, Berichtigung, Löschung, Datenportabilität, Widerspruch — alle Rechte müssen binnen eines Monats erfüllt werden.",
        "category": "data_subject_rights",
        "priority": "high",
    },
    {
        "title": "Datenschutzverletzungen melden",
        "regulation": "DSGVO",
        "article": "Art. 33-34",
        "description": "Meldung an die Aufsichtsbehörde binnen 72 Stunden; Benachrichtigung Betroffener bei hohem Risiko.",
        "category": "incident_response",
        "priority": "critical",
    },
    {
        "title": "Auftragsverarbeitungsverträge abschließen",
        "regulation": "DSGVO",
        "article": "Art. 28",
        "description": "Schriftliche AV-Verträge mit allen Dienstleistern, die personenbezogene Daten verarbeiten.",
        "category": "vendor_management",
        "priority": "high",
    },
    {
        "title": "Datenschutzbeauftragten benennen",
        "regulation": "DSGVO",
        "article": "Art. 37",
        "description": "Pflicht bei ≥20 Mitarbeitern in der automatisierten Verarbeitung oder bei Verarbeitung besonderer Kategorien.",
        "category": "governance",
        "priority": "medium",
    },
    {
        "title": "Löschkonzept implementieren",
        "regulation": "DSGVO",
        "article": "Art. 17",
        "description": "Recht auf Löschung — systematisches Löschkonzept mit definierten Fristen pro Datenkategorie.",
        "category": "data_lifecycle",
        "priority": "high",
    },
]

_AI_ACT_OBLIGATIONS = [
    {
        "title": "KI-System-Register führen",
        "regulation": "EU AI Act",
        "article": "Art. 49",
        "description": "Alle KI-Systeme müssen in der EU-Datenbank registriert werden (Hochrisiko).",
        "category": "documentation",
        "priority": "high",
    },
    {
        "title": "KI-Risikomanagement einrichten",
        "regulation": "EU AI Act",
        "article": "Art. 9",
        "description": "Risikomanagementsystem für den gesamten Lebenszyklus von Hochrisiko-KI-Systemen.",
        "category": "risk_assessment",
        "priority": "critical",
    },
    {
        "title": "KI-Transparenzpflichten erfüllen",
        "regulation": "EU AI Act",
        "article": "Art. 52",
        "description": "Nutzer müssen über die Interaktion mit KI-Systemen informiert werden.",
        "category": "transparency",
        "priority": "high",
    },
]

_NIS2_OBLIGATIONS = [
    {
        "title": "Cybersicherheits-Risikomanagement",
        "regulation": "NIS2",
        "article": "Art. 21",
        "description": "Angemessene und verhältnismäßige technische, betriebliche und organisatorische Maßnahmen.",
        "category": "cybersecurity",
        "priority": "critical",
    },
    {
        "title": "Meldepflichten NIS2",
        "regulation": "NIS2",
        "article": "Art. 23",
        "description": "Frühwarnung binnen 24h, Vorfallmeldung binnen 72h, Abschlussbericht binnen 1 Monat.",
        "category": "incident_response",
        "priority": "critical",
    },
]


def generate_obligation_drafts(ctx: dict) -> list[dict]:
    """Generate obligation drafts based on regulatory flags.

    Args:
        ctx: Flat dict from company-profile/template-context

    Returns:
        List of obligation dicts ready for creation
    """
    # Copy entries so the enrichment below doesn't mutate the module-level templates
    obligations = [dict(o) for o in _DSGVO_OBLIGATIONS]

    if ctx.get("subject_to_ai_act") or ctx.get("has_ai_systems"):
        obligations.extend(dict(o) for o in _AI_ACT_OBLIGATIONS)

    if ctx.get("subject_to_nis2"):
        obligations.extend(dict(o) for o in _NIS2_OBLIGATIONS)

    # Enrich with company context
    for i, o in enumerate(obligations, 1):
        o["obligation_id"] = f"OBL-AUTO-{i:03d}"
        o["status"] = "open"
        o["responsible"] = ctx.get("dpo_name", "")
        o["review_cycle_months"] = ctx.get("review_cycle_months", 12)

    return obligations
|
||||
@@ -0,0 +1,69 @@
"""TOM template generator — creates TOM checklist based on regulatory flags."""

_BASE_TOMS = [
    {"category": "Zutrittskontrolle", "name": "Physische Zugangskontrollen", "description": "Schlüssel, Kartenleser, Videoüberwachung"},
    {"category": "Zugangskontrolle", "name": "Authentifizierung", "description": "Passwortrichtlinien, MFA, SSO"},
    {"category": "Zugriffskontrolle", "name": "Berechtigungskonzept", "description": "RBAC, Least Privilege, regelmäßige Reviews"},
    {"category": "Weitergabekontrolle", "name": "Verschlüsselung im Transit", "description": "TLS 1.3 für alle Verbindungen"},
    {"category": "Eingabekontrolle", "name": "Audit-Logging", "description": "Protokollierung aller Datenänderungen"},
    {"category": "Auftragskontrolle", "name": "AV-Verträge", "description": "Art. 28 DSGVO Auftragsverarbeitungsverträge"},
    {"category": "Verfügbarkeitskontrolle", "name": "Backup & Recovery", "description": "Regelmäßige Backups, Disaster Recovery Plan"},
    {"category": "Trennungsgebot", "name": "Mandantentrennung", "description": "Logische Datentrennung nach Mandanten"},
]

_NIS2_TOMS = [
    {"category": "Cybersicherheit", "name": "Incident Response Plan", "description": "NIS2-konformer Vorfallreaktionsplan (72h Meldepflicht)"},
    {"category": "Cybersicherheit", "name": "Supply Chain Security", "description": "Bewertung der Lieferkettensicherheit"},
    {"category": "Cybersicherheit", "name": "Vulnerability Management", "description": "Regelmäßige Schwachstellenscans und Patch-Management"},
]

_ISO27001_TOMS = [
    {"category": "ISMS", "name": "Risikomanagement", "description": "ISO 27001 Anhang A — Informationssicherheits-Risikobewertung"},
    {"category": "ISMS", "name": "Dokumentenlenkung", "description": "Versionierte Sicherheitsrichtlinien und -verfahren"},
    {"category": "ISMS", "name": "Management Review", "description": "Jährliche Überprüfung des ISMS durch die Geschäftsleitung"},
]

_AI_ACT_TOMS = [
    {"category": "KI-Compliance", "name": "KI-Risikoklassifizierung", "description": "Bewertung aller KI-Systeme nach EU AI Act Risikokategorien"},
    {"category": "KI-Compliance", "name": "Human Oversight", "description": "Menschliche Aufsicht für Hochrisiko-KI-Systeme sicherstellen"},
    {"category": "KI-Compliance", "name": "KI-Transparenz", "description": "Transparenzpflichten bei KI-Einsatz gegenüber Betroffenen"},
]


def generate_tom_drafts(ctx: dict) -> list[dict]:
    """Generate TOM measure drafts based on regulatory flags.

    Args:
        ctx: Flat dict from company-profile/template-context

    Returns:
        List of TOM measure dicts ready for creation
    """
    measures = list(_BASE_TOMS)

    if ctx.get("subject_to_nis2"):
        measures.extend(_NIS2_TOMS)

    if ctx.get("subject_to_iso27001"):
        measures.extend(_ISO27001_TOMS)

    if ctx.get("subject_to_ai_act") or ctx.get("has_ai_systems"):
        measures.extend(_AI_ACT_TOMS)

    # Enrich with metadata
    result = []
    for i, m in enumerate(measures, 1):
        result.append({
            "control_id": f"TOM-AUTO-{i:03d}",
            "name": m["name"],
            "description": m["description"],
            "category": m["category"],
            "type": "technical" if m["category"] in ("Zugangskontrolle", "Zugriffskontrolle", "Weitergabekontrolle", "Cybersicherheit") else "organizational",
            "implementation_status": "planned",
            "responsible_department": "IT-Sicherheit",
            "priority": "high" if "KI" in m.get("category", "") or "Cyber" in m.get("category", "") else "medium",
            "review_frequency": f"{ctx.get('review_cycle_months', 12)} Monate",
        })

    return result
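The type and priority rules buried in the dict literal above read more clearly as a small pure function. This sketch mirrors the generator's logic; `classify` is an illustrative helper name, while the category strings are the real ones:

```python
# Mirrors the inline type/priority rules of generate_tom_drafts.
# `classify` is an illustrative helper, not part of the module.
TECHNICAL_CATEGORIES = {
    "Zugangskontrolle", "Zugriffskontrolle", "Weitergabekontrolle", "Cybersicherheit",
}

def classify(category: str) -> tuple[str, str]:
    kind = "technical" if category in TECHNICAL_CATEGORIES else "organizational"
    priority = "high" if "KI" in category or "Cyber" in category else "medium"
    return kind, priority

print(classify("Cybersicherheit"))  # ('technical', 'high')
print(classify("Trennungsgebot"))   # ('organizational', 'medium')
print(classify("KI-Compliance"))    # ('organizational', 'high')
```

Note the asymmetry: "KI-Compliance" measures get high priority but are still typed organizational, since only the four listed control categories count as technical.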
@@ -0,0 +1,53 @@
"""VVT template generator — creates VVT activity drafts per processing system."""


def generate_vvt_drafts(ctx: dict) -> list[dict]:
    """Generate VVT activity drafts, one per processing system.

    Args:
        ctx: Flat dict from company-profile/template-context

    Returns:
        List of VVT activity dicts ready for creation
    """
    systems = ctx.get("processing_systems", [])
    company = ctx.get("company_name", "Unbekannt")
    dpo = ctx.get("dpo_name", "")
    activities = []

    for i, system in enumerate(systems, 1):
        name = system.get("name", f"System {i}")
        vendor = system.get("vendor", "")
        hosting = system.get("hosting", "on-premise")
        categories = system.get("personal_data_categories", [])

        activity = {
            "vvt_id": f"VVT-AUTO-{i:03d}",
            "name": f"Verarbeitung in {name}",
            "description": f"Automatisch generierter VVT-Eintrag für das System '{name}'"
            + (f" (Anbieter: {vendor})" if vendor else ""),
            "purposes": [f"Datenverarbeitung via {name}"],
            "legal_bases": ["Art. 6 Abs. 1b DSGVO — Vertragserfüllung"],
            "data_subject_categories": [],
            "personal_data_categories": categories,
            "recipient_categories": [vendor] if vendor else [],
            "third_country_transfers": _assess_third_country(hosting),
            "retention_period": {"default": "Gemäß Löschfristenkatalog"},
            "tom_description": f"Siehe TOM-Katalog für {name}",
            "business_function": "IT",
            "systems": [name],
            "deployment_model": hosting,
            "protection_level": "HIGH" if categories else "MEDIUM",
            "dpia_required": len(categories) > 3,
            "status": "DRAFT",
            "responsible": dpo or company,
        }
        activities.append(activity)

    return activities


def _assess_third_country(hosting: str) -> list:
    if hosting in ("us-cloud", "international"):
        return [{"country": "USA", "mechanism": "EU-US Data Privacy Framework"}]
    return []
@@ -33,7 +33,11 @@ from classroom_engine.database import get_db
logger = logging.getLogger(__name__)
router = APIRouter(prefix="/dsfa", tags=["compliance-dsfa"])

-DEFAULT_TENANT_ID = "default"
+from .tenant_utils import get_tenant_id as _shared_get_tenant_id
+
+# Legacy compat — still used by _get_tenant_id() below; will be removed once
+# all call-sites switch to Depends(get_tenant_id).
+DEFAULT_TENANT_ID = "9282a473-5c95-4b3a-bf78-0ecc0ec71d3e"

VALID_STATUSES = {"draft", "in-review", "approved", "needs-update"}
VALID_RISK_LEVELS = {"low", "medium", "high", "critical"}
@@ -909,3 +913,33 @@ async def export_dsfa_json(
    }


# =============================================================================
# Versioning
# =============================================================================

@router.get("/{dsfa_id}/versions")
async def list_dsfa_versions(
    dsfa_id: str,
    tenant_id: Optional[str] = Query(None),
    db: Session = Depends(get_db),
):
    """List all versions for a DSFA."""
    from .versioning_utils import list_versions
    tid = _get_tenant_id(tenant_id)
    return list_versions(db, "dsfa", dsfa_id, tid)


@router.get("/{dsfa_id}/versions/{version_number}")
async def get_dsfa_version(
    dsfa_id: str,
    version_number: int,
    tenant_id: Optional[str] = Query(None),
    db: Session = Depends(get_db),
):
    """Get a specific DSFA version with full snapshot."""
    from .versioning_utils import get_version
    tid = _get_tenant_id(tenant_id)
    v = get_version(db, "dsfa", dsfa_id, version_number, tid)
    if not v:
        raise HTTPException(status_code=404, detail=f"Version {version_number} not found")
    return v

backend-compliance/compliance/api/generation_routes.py (new file, 186 lines)
@@ -0,0 +1,186 @@
"""
FastAPI routes for Document Generation from Stammdaten.

Endpoints:
    GET  /generation/preview/{doc_type} — Markdown preview from Stammdaten
    POST /generation/apply/{doc_type}   — Generate drafts → create Change-Requests
"""

import json
import logging
from typing import Optional

from fastapi import APIRouter, Depends, HTTPException, Header
from sqlalchemy import text
from sqlalchemy.orm import Session

from classroom_engine.database import get_db
from .tenant_utils import get_tenant_id
from .document_templates import (
    generate_dsfa_draft,
    generate_vvt_drafts,
    generate_loeschfristen_drafts,
    generate_tom_drafts,
    generate_obligation_drafts,
)

logger = logging.getLogger(__name__)
router = APIRouter(prefix="/generation", tags=["generation"])

VALID_DOC_TYPES = {"dsfa", "vvt", "tom", "loeschfristen", "obligation"}

def _get_template_context(db, tid: str) -> Optional[dict]:
    """Fetch company profile and build template context. Returns None if missing."""
    from .company_profile_routes import _BASE_COLUMNS, row_to_response
    from database import SessionLocal

    # Use a fresh session for company_profile (different DB import pattern)
    cp_db = SessionLocal()
    try:
        result = cp_db.execute(
            # text() wrapper required — SQLAlchemy 2.x no longer accepts raw strings
            text(f"SELECT {_BASE_COLUMNS} FROM compliance_company_profiles WHERE tenant_id = :tid"),
            {"tid": tid},
        )
        row = result.fetchone()
        if not row:
            return None
        resp = row_to_response(row)
        # Build flat context
        return {
            "company_name": resp.company_name,
            "legal_form": resp.legal_form,
            "industry": resp.industry,
            "business_model": resp.business_model,
            "company_size": resp.company_size,
            "employee_count": resp.employee_count,
            "headquarters_country": resp.headquarters_country,
            "headquarters_city": resp.headquarters_city,
            "primary_jurisdiction": resp.primary_jurisdiction,
            "is_data_controller": resp.is_data_controller,
            "is_data_processor": resp.is_data_processor,
            "uses_ai": resp.uses_ai,
            "dpo_name": resp.dpo_name or "",
            "dpo_email": resp.dpo_email or "",
            "supervisory_authority": resp.supervisory_authority or "",
            "review_cycle_months": resp.review_cycle_months,
            "subject_to_nis2": resp.subject_to_nis2,
            "subject_to_ai_act": resp.subject_to_ai_act,
            "subject_to_iso27001": resp.subject_to_iso27001,
            "offerings": resp.offerings,
            "target_markets": resp.target_markets,
            "ai_use_cases": resp.ai_use_cases,
            "repos": resp.repos,
            "document_sources": resp.document_sources,
            "processing_systems": resp.processing_systems,
            "ai_systems": resp.ai_systems,
            "technical_contacts": resp.technical_contacts,
            "has_ai_systems": len(resp.ai_systems) > 0,
            "processing_system_count": len(resp.processing_systems),
            "ai_system_count": len(resp.ai_systems),
            "is_complete": resp.is_complete,
        }
    finally:
        cp_db.close()

def _generate_for_type(doc_type: str, ctx: dict):
    """Call the appropriate template generator."""
    if doc_type == "dsfa":
        return [generate_dsfa_draft(ctx)]
    elif doc_type == "vvt":
        return generate_vvt_drafts(ctx)
    elif doc_type == "tom":
        return generate_tom_drafts(ctx)
    elif doc_type == "loeschfristen":
        return generate_loeschfristen_drafts(ctx)
    elif doc_type == "obligation":
        return generate_obligation_drafts(ctx)
    else:
        raise ValueError(f"Unknown doc_type: {doc_type}")

@router.get("/preview/{doc_type}")
|
||||
async def preview_generation(
|
||||
doc_type: str,
|
||||
tid: str = Depends(get_tenant_id),
|
||||
db: Session = Depends(get_db),
|
||||
):
|
||||
"""Preview what documents would be generated (no DB writes)."""
|
||||
if doc_type not in VALID_DOC_TYPES:
|
||||
raise HTTPException(status_code=400, detail=f"Invalid doc_type: {doc_type}. Valid: {VALID_DOC_TYPES}")
|
||||
|
||||
ctx = _get_template_context(db, tid)
|
||||
if not ctx:
|
||||
raise HTTPException(status_code=404, detail="Company profile not found — fill Stammdaten first")
|
||||
|
||||
drafts = _generate_for_type(doc_type, ctx)
|
||||
|
||||
return {
|
||||
"doc_type": doc_type,
|
||||
"count": len(drafts),
|
||||
"drafts": drafts,
|
||||
"company_name": ctx.get("company_name"),
|
||||
"is_preview": True,
|
||||
}
|
||||
|
||||
|
||||
@router.post("/apply/{doc_type}")
|
||||
async def apply_generation(
|
||||
doc_type: str,
|
||||
tid: str = Depends(get_tenant_id),
|
||||
db: Session = Depends(get_db),
|
||||
x_user_id: Optional[str] = Header(None, alias="X-User-ID"),
|
||||
):
|
||||
"""Generate drafts and create Change-Requests for each.
|
||||
|
||||
Does NOT create documents directly — all go through the CR inbox.
|
||||
"""
|
||||
if doc_type not in VALID_DOC_TYPES:
|
||||
raise HTTPException(status_code=400, detail=f"Invalid doc_type: {doc_type}. Valid: {VALID_DOC_TYPES}")
|
||||
|
||||
ctx = _get_template_context(db, tid)
|
||||
if not ctx:
|
||||
raise HTTPException(status_code=404, detail="Company profile not found — fill Stammdaten first")
|
||||
|
||||
drafts = _generate_for_type(doc_type, ctx)
|
||||
user = x_user_id or "system"
|
||||
|
||||
cr_ids = []
|
||||
for draft in drafts:
|
||||
title = draft.get("title") or draft.get("name") or draft.get("data_category") or f"Neues {doc_type}-Dokument"
|
||||
try:
|
||||
result = db.execute(
|
||||
text("""
|
||||
INSERT INTO compliance_change_requests
|
||||
(tenant_id, trigger_type, target_document_type,
|
||||
proposal_title, proposal_body, proposed_changes,
|
||||
priority, created_by)
|
||||
VALUES (:tid, 'generation', :doc_type,
|
||||
:title, :body, CAST(:changes AS jsonb),
|
||||
'normal', :user)
|
||||
RETURNING id
|
||||
"""),
|
||||
{
|
||||
"tid": tid,
|
||||
"doc_type": doc_type,
|
||||
"title": f"[Generiert] {title}",
|
||||
"body": f"Automatisch aus Stammdaten generiert für {ctx.get('company_name', '')}",
|
||||
"changes": json.dumps(draft),
|
||||
"user": user,
|
||||
},
|
||||
)
|
||||
row = result.fetchone()
|
||||
if row:
|
||||
cr_ids.append(str(row[0]))
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to create CR for draft: {e}")
|
||||
|
||||
db.commit()
|
||||
|
||||
return {
|
||||
"doc_type": doc_type,
|
||||
"drafts_generated": len(drafts),
|
||||
"change_requests_created": len(cr_ids),
|
||||
"change_request_ids": cr_ids,
|
||||
}
|
||||
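The per-draft title fallback inside the loop above relies on `or` treating empty strings as missing, so a draft with `title=""` falls through to `name` and onward. A small sketch of that inline expression (`cr_title` is a hypothetical helper name, not part of the module):

```python
# The title fallback chain from apply_generation, as a function.
def cr_title(draft: dict, doc_type: str) -> str:
    base = (draft.get("title") or draft.get("name")
            or draft.get("data_category") or f"Neues {doc_type}-Dokument")
    return f"[Generiert] {base}"

print(cr_title({"name": "Verarbeitung in CRM"}, "vvt"))  # [Generiert] Verarbeitung in CRM
print(cr_title({"title": ""}, "tom"))                    # [Generiert] Neues tom-Dokument
```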
@@ -352,3 +352,35 @@ async def delete_loeschfrist(
    db.commit()
    if result.rowcount == 0:
        raise HTTPException(status_code=404, detail="Loeschfrist not found")


# =============================================================================
# Versioning
# =============================================================================

@router.get("/{policy_id}/versions")
async def list_loeschfristen_versions(
    policy_id: str,
    db: Session = Depends(get_db),
    x_tenant_id: Optional[str] = Header(None),
):
    """List all versions for a Loeschfrist."""
    from .versioning_utils import list_versions
    tenant_id = _get_tenant_id(x_tenant_id)
    return list_versions(db, "loeschfristen", policy_id, tenant_id)


@router.get("/{policy_id}/versions/{version_number}")
async def get_loeschfristen_version(
    policy_id: str,
    version_number: int,
    db: Session = Depends(get_db),
    x_tenant_id: Optional[str] = Header(None),
):
    """Get a specific Loeschfristen version with full snapshot."""
    from .versioning_utils import get_version
    tenant_id = _get_tenant_id(x_tenant_id)
    v = get_version(db, "loeschfristen", policy_id, version_number, tenant_id)
    if not v:
        raise HTTPException(status_code=404, detail=f"Version {version_number} not found")
    return v

@@ -324,3 +324,35 @@ async def delete_obligation(
    db.commit()
    if result.rowcount == 0:
        raise HTTPException(status_code=404, detail="Obligation not found")


# =============================================================================
# Versioning
# =============================================================================

@router.get("/{obligation_id}/versions")
async def list_obligation_versions(
    obligation_id: str,
    db: Session = Depends(get_db),
    x_tenant_id: Optional[str] = Header(None),
):
    """List all versions for an Obligation."""
    from .versioning_utils import list_versions
    tenant_id = _get_tenant_id(x_tenant_id)
    return list_versions(db, "obligation", obligation_id, tenant_id)


@router.get("/{obligation_id}/versions/{version_number}")
async def get_obligation_version(
    obligation_id: str,
    version_number: int,
    db: Session = Depends(get_db),
    x_tenant_id: Optional[str] = Header(None),
):
    """Get a specific Obligation version with full snapshot."""
    from .versioning_utils import get_version
    tenant_id = _get_tenant_id(x_tenant_id)
    v = get_version(db, "obligation", obligation_id, version_number, tenant_id)
    if not v:
        raise HTTPException(status_code=404, detail=f"Version {version_number} not found")
    return v

backend-compliance/compliance/api/tenant_utils.py (new file, 58 lines)
@@ -0,0 +1,58 @@
"""
Shared tenant middleware for all Compliance API routes.

Provides a central FastAPI dependency that resolves tenant_id from:
1. X-Tenant-ID header (primary)
2. Query parameter tenant_id (fallback)
3. Environment variable DEFAULT_TENANT_ID (last resort)

UUID validation ensures no more "default" strings leak through.
"""

import os
import re
import logging
from typing import Optional

from fastapi import Header, Query, HTTPException

logger = logging.getLogger(__name__)

# Fallback for local development — real deployments must pass X-Tenant-ID
_ENV_DEFAULT = os.getenv(
    "DEFAULT_TENANT_ID", "9282a473-5c95-4b3a-bf78-0ecc0ec71d3e"
)

_UUID_RE = re.compile(
    r"^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$", re.I
)


def _validate_tenant_id(tid: str) -> str:
    """Validate that tenant_id looks like a UUID. Reject 'default' etc."""
    if tid == "default":
        raise HTTPException(
            status_code=400,
            detail="Tenant ID 'default' is no longer accepted. Pass a valid UUID.",
        )
    if not _UUID_RE.match(tid):
        raise HTTPException(
            status_code=400,
            detail=f"Invalid tenant_id format: '{tid}'. Must be a UUID.",
        )
    return tid


async def get_tenant_id(
    x_tenant_id: Optional[str] = Header(None, alias="X-Tenant-ID"),
    tenant_id: Optional[str] = Query(None),
) -> str:
    """FastAPI dependency — resolves + validates tenant ID.

    Usage:
        @router.get("/something")
        async def my_endpoint(tid: str = Depends(get_tenant_id)):
            ...
    """
    raw = x_tenant_id or tenant_id or _ENV_DEFAULT
    return _validate_tenant_id(raw)
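The validation gate can be exercised without FastAPI. This stand-alone sketch keeps the same regex and the same rules but raises `ValueError` in place of `HTTPException` (`validate_tenant_id` is an illustrative name for the test double):

```python
import re

# Same pattern as tenant_utils._UUID_RE; ValueError stands in for HTTPException.
_UUID_RE = re.compile(
    r"^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$", re.I
)

def validate_tenant_id(tid: str) -> str:
    if tid == "default":
        raise ValueError("Tenant ID 'default' is no longer accepted. Pass a valid UUID.")
    if not _UUID_RE.match(tid):
        raise ValueError(f"Invalid tenant_id format: {tid!r}. Must be a UUID.")
    return tid

print(validate_tenant_id("9282a473-5c95-4b3a-bf78-0ecc0ec71d3e"))  # passes through unchanged
```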
@@ -573,3 +573,37 @@ async def export_measures(
        media_type="text/csv; charset=utf-8",
        headers={"Content-Disposition": "attachment; filename=tom_export.csv"},
    )


# =============================================================================
# Versioning
# =============================================================================

@router.get("/measures/{measure_id}/versions")
async def list_measure_versions(
    measure_id: str,
    tenant_id: Optional[str] = Query(None, alias="tenant_id"),
    tenantId: Optional[str] = Query(None, alias="tenantId"),
    db: Session = Depends(get_db),
):
    """List all versions for a TOM measure."""
    from .versioning_utils import list_versions
    tid = tenant_id or tenantId or DEFAULT_TENANT_ID
    return list_versions(db, "tom", measure_id, tid)


@router.get("/measures/{measure_id}/versions/{version_number}")
async def get_measure_version(
    measure_id: str,
    version_number: int,
    tenant_id: Optional[str] = Query(None, alias="tenant_id"),
    tenantId: Optional[str] = Query(None, alias="tenantId"),
    db: Session = Depends(get_db),
):
    """Get a specific TOM measure version with full snapshot."""
    from .versioning_utils import get_version
    tid = tenant_id or tenantId or DEFAULT_TENANT_ID
    v = get_version(db, "tom", measure_id, version_number, tid)
    if not v:
        raise HTTPException(status_code=404, detail=f"Version {version_number} not found")
    return v

@@ -62,7 +62,8 @@ from classroom_engine.database import get_db
logger = logging.getLogger(__name__)
router = APIRouter(prefix="/vendor-compliance", tags=["vendor-compliance"])

-DEFAULT_TENANT_ID = "default"
+# Default tenant UUID — "default" string no longer accepted
+DEFAULT_TENANT_ID = "9282a473-5c95-4b3a-bf78-0ecc0ec71d3e"

# =============================================================================
# Helpers
backend-compliance/compliance/api/versioning_utils.py (new file, 175 lines)
@@ -0,0 +1,175 @@
"""
Shared versioning utilities for all compliance document types.

Provides create_version_snapshot() and list_versions() helpers that work
with all 5 version tables (DSFA, VVT, TOM, Loeschfristen, Obligations).
"""

import json
import logging
from typing import Optional, List

from sqlalchemy import text
from sqlalchemy.orm import Session

logger = logging.getLogger(__name__)

# doc_type → (version table, FK column, source table)
VERSION_TABLES = {
    "dsfa": ("compliance_dsfa_versions", "dsfa_id", "compliance_dsfas"),
    "vvt_activity": ("compliance_vvt_activity_versions", "activity_id", "compliance_vvt_activities"),
    "tom": ("compliance_tom_versions", "measure_id", "compliance_tom_measures"),
    "loeschfristen": ("compliance_loeschfristen_versions", "policy_id", "compliance_loeschfristen"),
    "obligation": ("compliance_obligation_versions", "obligation_id", "compliance_obligations"),
}

def create_version_snapshot(
    db: Session,
    doc_type: str,
    doc_id: str,
    tenant_id: str,
    snapshot: dict,
    change_summary: str = "",
    changed_sections: Optional[list] = None,
    created_by: str = "system",
) -> dict:
    """Create a new version snapshot for any document type.

    Args:
        doc_type: One of "dsfa", "vvt_activity", "tom", "loeschfristen", "obligation"
        doc_id: UUID of the source document
        tenant_id: Tenant UUID
        snapshot: Full JSONB snapshot of the document state
        change_summary: Human-readable summary of changes
        changed_sections: List of section identifiers that changed
        created_by: User who created this version

    Returns:
        Dict with version info (id, version_number, created_at)
    """
    if doc_type not in VERSION_TABLES:
        raise ValueError(f"Unknown document type: {doc_type}")

    version_table, fk_column, source_table = VERSION_TABLES[doc_type]

    # Get next version number
    result = db.execute(
        text(f"SELECT COALESCE(MAX(version_number), 0) FROM {version_table} WHERE {fk_column} = :doc_id"),
        {"doc_id": doc_id},
    )
    next_version = result.scalar() + 1

    # Insert version
    result = db.execute(
        text(f"""
            INSERT INTO {version_table}
                ({fk_column}, tenant_id, version_number, snapshot, change_summary,
                 changed_sections, created_by)
            VALUES (:doc_id, :tenant_id, :version_number, CAST(:snapshot AS jsonb),
                    :change_summary, CAST(:changed_sections AS jsonb), :created_by)
            RETURNING id, version_number, created_at
        """),
        {
            "doc_id": doc_id,
            "tenant_id": tenant_id,
            "version_number": next_version,
            "snapshot": json.dumps(snapshot),
            "change_summary": change_summary,
            "changed_sections": json.dumps(changed_sections or []),
            "created_by": created_by,
        },
    )
    row = result.fetchone()

    # Update current_version on the source table
    db.execute(
        text(f"UPDATE {source_table} SET current_version = :v WHERE id = :doc_id"),
        {"v": next_version, "doc_id": doc_id},
    )

    return {
        "id": str(row[0]),
        "version_number": row[1],
        "created_at": row[2].isoformat() if row[2] else None,
    }

def list_versions(
    db: Session,
    doc_type: str,
    doc_id: str,
    tenant_id: str,
) -> List[dict]:
    """List all versions for a document, newest first."""
    if doc_type not in VERSION_TABLES:
        raise ValueError(f"Unknown document type: {doc_type}")

    version_table, fk_column, _ = VERSION_TABLES[doc_type]

    result = db.execute(
        text(f"""
            SELECT id, version_number, status, change_summary, changed_sections,
                   created_by, approved_by, approved_at, created_at
            FROM {version_table}
            WHERE {fk_column} = :doc_id AND tenant_id = :tenant_id
            ORDER BY version_number DESC
        """),
        {"doc_id": doc_id, "tenant_id": tenant_id},
    )
    rows = result.fetchall()
    return [
        {
            "id": str(r[0]),
            "version_number": r[1],
            "status": r[2],
            "change_summary": r[3],
            "changed_sections": r[4] or [],
            "created_by": r[5],
            "approved_by": r[6],
            "approved_at": r[7].isoformat() if r[7] else None,
            "created_at": r[8].isoformat() if r[8] else None,
        }
        for r in rows
    ]

def get_version(
    db: Session,
    doc_type: str,
    doc_id: str,
    version_number: int,
    tenant_id: str,
) -> Optional[dict]:
    """Get a specific version with its full snapshot."""
    if doc_type not in VERSION_TABLES:
        raise ValueError(f"Unknown document type: {doc_type}")

    version_table, fk_column, _ = VERSION_TABLES[doc_type]

    result = db.execute(
        text(f"""
            SELECT id, version_number, status, snapshot, change_summary,
                   changed_sections, created_by, approved_by, approved_at, created_at
            FROM {version_table}
            WHERE {fk_column} = :doc_id AND version_number = :v AND tenant_id = :tenant_id
        """),
        {"doc_id": doc_id, "v": version_number, "tenant_id": tenant_id},
    )
    r = result.fetchone()
    if not r:
        return None

    return {
        "id": str(r[0]),
        "version_number": r[1],
        "status": r[2],
        "snapshot": r[3],
        "change_summary": r[4],
        "changed_sections": r[5] or [],
        "created_by": r[6],
        "approved_by": r[7],
        "approved_at": r[8].isoformat() if r[8] else None,
        "created_at": r[9].isoformat() if r[9] else None,
    }
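The SQL in create_version_snapshot derives the next version as `COALESCE(MAX(version_number), 0) + 1`, so the first snapshot of a document is version 1 even with an empty version table. The same rule, reduced to a pure function (`next_version` is an illustrative name, not part of the module):

```python
# Pure-Python restatement of the version-numbering rule in create_version_snapshot.
def next_version(existing: list[int]) -> int:
    return (max(existing) if existing else 0) + 1

print(next_version([]))         # 1 — first snapshot of a document
print(next_version([1, 2, 3]))  # 4
```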
@@ -33,6 +33,7 @@ from .schemas import (
    VVTActivityCreate, VVTActivityUpdate, VVTActivityResponse,
    VVTStatsResponse, VVTAuditLogEntry,
)
+from .tenant_utils import get_tenant_id

logger = logging.getLogger(__name__)
router = APIRouter(prefix="/vvt", tags=["compliance-vvt"])

@@ -40,6 +41,7 @@ router = APIRouter(prefix="/vvt", tags=["compliance-vvt"])

def _log_audit(
    db: Session,
+    tenant_id: str,
    action: str,
    entity_type: str,
    entity_id=None,

@@ -48,6 +50,7 @@ def _log_audit(
    new_values=None,
):
    entry = VVTAuditLogDB(
+        tenant_id=tenant_id,
        action=action,
        entity_type=entity_type,
        entity_id=entity_id,
@@ -63,9 +66,17 @@ def _log_audit(
# ============================================================================

@router.get("/organization", response_model=Optional[VVTOrganizationResponse])
-async def get_organization(db: Session = Depends(get_db)):
-    """Load the VVT organization header (single record)."""
-    org = db.query(VVTOrganizationDB).order_by(VVTOrganizationDB.created_at).first()
+async def get_organization(
+    tid: str = Depends(get_tenant_id),
+    db: Session = Depends(get_db),
+):
+    """Load the VVT organization header for the given tenant."""
+    org = (
+        db.query(VVTOrganizationDB)
+        .filter(VVTOrganizationDB.tenant_id == tid)
+        .order_by(VVTOrganizationDB.created_at)
+        .first()
+    )
    if not org:
        return None
    return VVTOrganizationResponse(
@@ -88,15 +99,22 @@ async def get_organization(db: Session = Depends(get_db)):
@router.put("/organization", response_model=VVTOrganizationResponse)
async def upsert_organization(
    request: VVTOrganizationUpdate,
+    tid: str = Depends(get_tenant_id),
    db: Session = Depends(get_db),
):
    """Create or update the VVT organization header."""
-    org = db.query(VVTOrganizationDB).order_by(VVTOrganizationDB.created_at).first()
+    org = (
+        db.query(VVTOrganizationDB)
+        .filter(VVTOrganizationDB.tenant_id == tid)
+        .order_by(VVTOrganizationDB.created_at)
+        .first()
+    )

    if not org:
        data = request.dict(exclude_none=True)
        if 'organization_name' not in data:
            data['organization_name'] = 'Meine Organisation'
+        data['tenant_id'] = tid
        org = VVTOrganizationDB(**data)
        db.add(org)
    else:
@@ -168,10 +186,11 @@ async def list_activities(
    business_function: Optional[str] = Query(None),
    search: Optional[str] = Query(None),
    review_overdue: Optional[bool] = Query(None),
+    tid: str = Depends(get_tenant_id),
    db: Session = Depends(get_db),
):
    """List all processing activities with optional filters."""
-    query = db.query(VVTActivityDB)
+    query = db.query(VVTActivityDB).filter(VVTActivityDB.tenant_id == tid)

    if status:
        query = query.filter(VVTActivityDB.status == status)

@@ -199,12 +218,14 @@ async def list_activities(
async def create_activity(
    request: VVTActivityCreate,
    http_request: Request,
+    tid: str = Depends(get_tenant_id),
    db: Session = Depends(get_db),
):
    """Create a new processing activity."""
-    # Check for duplicate vvt_id
+    # Check for duplicate vvt_id within tenant
    existing = db.query(VVTActivityDB).filter(
-        VVTActivityDB.vvt_id == request.vvt_id
+        VVTActivityDB.tenant_id == tid,
+        VVTActivityDB.vvt_id == request.vvt_id,
    ).first()
    if existing:
        raise HTTPException(
@@ -213,6 +234,7 @@ async def create_activity(
|
||||
)
|
||||
|
||||
data = request.dict()
|
||||
data['tenant_id'] = tid
|
||||
# Set created_by from X-User-ID header if not provided in body
|
||||
if not data.get('created_by'):
|
||||
data['created_by'] = http_request.headers.get('X-User-ID', 'system')
|
||||
@@ -223,6 +245,7 @@ async def create_activity(
|
||||
|
||||
_log_audit(
|
||||
db,
|
||||
tenant_id=tid,
|
||||
action="CREATE",
|
||||
entity_type="activity",
|
||||
entity_id=act.id,
|
||||
@@ -235,9 +258,16 @@ async def create_activity(
|
||||
|
||||
|
||||
@router.get("/activities/{activity_id}", response_model=VVTActivityResponse)
|
||||
async def get_activity(activity_id: str, db: Session = Depends(get_db)):
|
||||
async def get_activity(
|
||||
activity_id: str,
|
||||
tid: str = Depends(get_tenant_id),
|
||||
db: Session = Depends(get_db),
|
||||
):
|
||||
"""Get a single processing activity by ID."""
|
||||
act = db.query(VVTActivityDB).filter(VVTActivityDB.id == activity_id).first()
|
||||
act = db.query(VVTActivityDB).filter(
|
||||
VVTActivityDB.id == activity_id,
|
||||
VVTActivityDB.tenant_id == tid,
|
||||
).first()
|
||||
if not act:
|
||||
raise HTTPException(status_code=404, detail=f"Activity {activity_id} not found")
|
||||
return _activity_to_response(act)
|
||||
@@ -247,10 +277,14 @@ async def get_activity(activity_id: str, db: Session = Depends(get_db)):
|
||||
async def update_activity(
|
||||
activity_id: str,
|
||||
request: VVTActivityUpdate,
|
||||
tid: str = Depends(get_tenant_id),
|
||||
db: Session = Depends(get_db),
|
||||
):
|
||||
"""Update a processing activity."""
|
||||
act = db.query(VVTActivityDB).filter(VVTActivityDB.id == activity_id).first()
|
||||
act = db.query(VVTActivityDB).filter(
|
||||
VVTActivityDB.id == activity_id,
|
||||
VVTActivityDB.tenant_id == tid,
|
||||
).first()
|
||||
if not act:
|
||||
raise HTTPException(status_code=404, detail=f"Activity {activity_id} not found")
|
||||
|
||||
@@ -262,6 +296,7 @@ async def update_activity(
|
||||
|
||||
_log_audit(
|
||||
db,
|
||||
tenant_id=tid,
|
||||
action="UPDATE",
|
||||
entity_type="activity",
|
||||
entity_id=act.id,
|
||||
@@ -275,14 +310,22 @@ async def update_activity(
|
||||
|
||||
|
||||
@router.delete("/activities/{activity_id}")
|
||||
async def delete_activity(activity_id: str, db: Session = Depends(get_db)):
|
||||
async def delete_activity(
|
||||
activity_id: str,
|
||||
tid: str = Depends(get_tenant_id),
|
||||
db: Session = Depends(get_db),
|
||||
):
|
||||
"""Delete a processing activity."""
|
||||
act = db.query(VVTActivityDB).filter(VVTActivityDB.id == activity_id).first()
|
||||
act = db.query(VVTActivityDB).filter(
|
||||
VVTActivityDB.id == activity_id,
|
||||
VVTActivityDB.tenant_id == tid,
|
||||
).first()
|
||||
if not act:
|
||||
raise HTTPException(status_code=404, detail=f"Activity {activity_id} not found")
|
||||
|
||||
_log_audit(
|
||||
db,
|
||||
tenant_id=tid,
|
||||
action="DELETE",
|
||||
entity_type="activity",
|
||||
entity_id=act.id,
|
||||
@@ -302,11 +345,13 @@ async def delete_activity(activity_id: str, db: Session = Depends(get_db)):
|
||||
async def get_audit_log(
|
||||
limit: int = Query(50, ge=1, le=500),
|
||||
offset: int = Query(0, ge=0),
|
||||
tid: str = Depends(get_tenant_id),
|
||||
db: Session = Depends(get_db),
|
||||
):
|
||||
"""Get the VVT audit trail."""
|
||||
entries = (
|
||||
db.query(VVTAuditLogDB)
|
||||
.filter(VVTAuditLogDB.tenant_id == tid)
|
||||
.order_by(VVTAuditLogDB.created_at.desc())
|
||||
.offset(offset)
|
||||
.limit(limit)
|
||||
@@ -334,14 +379,26 @@ async def get_audit_log(
|
||||
@router.get("/export")
|
||||
async def export_activities(
|
||||
format: str = Query("json", pattern="^(json|csv)$"),
|
||||
tid: str = Depends(get_tenant_id),
|
||||
db: Session = Depends(get_db),
|
||||
):
|
||||
"""Export all activities as JSON or CSV (semicolon-separated, DE locale)."""
|
||||
org = db.query(VVTOrganizationDB).order_by(VVTOrganizationDB.created_at).first()
|
||||
activities = db.query(VVTActivityDB).order_by(VVTActivityDB.created_at).all()
|
||||
org = (
|
||||
db.query(VVTOrganizationDB)
|
||||
.filter(VVTOrganizationDB.tenant_id == tid)
|
||||
.order_by(VVTOrganizationDB.created_at)
|
||||
.first()
|
||||
)
|
||||
activities = (
|
||||
db.query(VVTActivityDB)
|
||||
.filter(VVTActivityDB.tenant_id == tid)
|
||||
.order_by(VVTActivityDB.created_at)
|
||||
.all()
|
||||
)
|
||||
|
||||
_log_audit(
|
||||
db,
|
||||
tenant_id=tid,
|
||||
action="EXPORT",
|
||||
entity_type="all_activities",
|
||||
new_values={"count": len(activities), "format": format},
|
||||
@@ -432,9 +489,12 @@ def _export_csv(activities: list) -> StreamingResponse:
|
||||
|
||||
|
||||
@router.get("/stats", response_model=VVTStatsResponse)
|
||||
async def get_stats(db: Session = Depends(get_db)):
|
||||
async def get_stats(
|
||||
tid: str = Depends(get_tenant_id),
|
||||
db: Session = Depends(get_db),
|
||||
):
|
||||
"""Get VVT statistics summary."""
|
||||
activities = db.query(VVTActivityDB).all()
|
||||
activities = db.query(VVTActivityDB).filter(VVTActivityDB.tenant_id == tid).all()
|
||||
|
||||
by_status: dict = {}
|
||||
by_bf: dict = {}
|
||||
@@ -459,3 +519,33 @@ async def get_stats(db: Session = Depends(get_db)):
|
||||
approved_count=by_status.get('APPROVED', 0),
|
||||
overdue_review_count=overdue_count,
|
||||
)
|
||||
|
||||
|
||||
# ============================================================================
|
||||
# Versioning
|
||||
# ============================================================================
|
||||
|
||||
@router.get("/activities/{activity_id}/versions")
|
||||
async def list_activity_versions(
|
||||
activity_id: str,
|
||||
tid: str = Depends(get_tenant_id),
|
||||
db: Session = Depends(get_db),
|
||||
):
|
||||
"""List all versions for a VVT activity."""
|
||||
from .versioning_utils import list_versions
|
||||
return list_versions(db, "vvt_activity", activity_id, tid)
|
||||
|
||||
|
||||
@router.get("/activities/{activity_id}/versions/{version_number}")
|
||||
async def get_activity_version(
|
||||
activity_id: str,
|
||||
version_number: int,
|
||||
tid: str = Depends(get_tenant_id),
|
||||
db: Session = Depends(get_db),
|
||||
):
|
||||
"""Get a specific VVT activity version with full snapshot."""
|
||||
from .versioning_utils import get_version
|
||||
v = get_version(db, "vvt_activity", activity_id, version_number, tid)
|
||||
if not v:
|
||||
raise HTTPException(status_code=404, detail=f"Version {version_number} not found")
|
||||
return v
|
||||
|
||||
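Every route above resolves `tid` through the shared `get_tenant_id` dependency from `tenant_utils.py`, which per Phase 1 validates the tenant as a UUID and no longer accepts the old `"default"` sentinel. A minimal, stdlib-only sketch of that validation rule (the function name `validate_tenant_id`, its body, and the error text are illustrative, not the actual `tenant_utils.py` code):

```python
import uuid


def validate_tenant_id(raw: str) -> str:
    """Return the canonical UUID string for a tenant id, or raise.

    Rejects the legacy "default" sentinel along with any other
    non-UUID value (sketch of the Phase 1 rule, not the real helper).
    """
    try:
        return str(uuid.UUID(raw))
    except (TypeError, ValueError) as exc:
        raise ValueError(f"invalid tenant id: {raw!r}") from exc
```

In the actual FastAPI dependency this check would read the tenant header and translate the failure into a 4xx response instead of a bare `ValueError`.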
@@ -24,6 +24,7 @@ class VVTOrganizationDB(Base):
     __tablename__ = 'compliance_vvt_organization'
 
     id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
+    tenant_id = Column(String(255), nullable=False, index=True)
     organization_name = Column(String(300), nullable=False)
     industry = Column(String(100))
     locations = Column(JSON, default=list)
@@ -51,7 +52,8 @@ class VVTActivityDB(Base):
     __tablename__ = 'compliance_vvt_activities'
 
     id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
-    vvt_id = Column(String(50), unique=True, nullable=False)
+    tenant_id = Column(String(255), nullable=False, index=True)
+    vvt_id = Column(String(50), nullable=False)
     name = Column(String(300), nullable=False)
     description = Column(Text)
     purposes = Column(JSON, default=list)
@@ -83,7 +85,7 @@ class VVTActivityDB(Base):
     __table_args__ = (
         Index('idx_vvt_activities_status', 'status'),
         Index('idx_vvt_activities_business_function', 'business_function'),
-        Index('idx_vvt_activities_vvt_id', 'vvt_id'),
+        Index('idx_vvt_activities_tenant_status', 'tenant_id', 'status'),
     )
 
     def __repr__(self):
@@ -96,6 +98,7 @@ class VVTAuditLogDB(Base):
     __tablename__ = 'compliance_vvt_audit_log'
 
     id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
+    tenant_id = Column(String(255), nullable=False, index=True)
     action = Column(String(20), nullable=False)  # CREATE, UPDATE, DELETE, EXPORT
     entity_type = Column(String(50), nullable=False)  # activity, organization
     entity_id = Column(UUID(as_uuid=True))
backend-compliance/migrations/035_vvt_tenant_isolation.sql (new file, 105 lines)
@@ -0,0 +1,105 @@
+-- Migration 035: VVT Tenant Isolation + DSFA/Vendor "default" → UUID Fix
+-- Adds tenant_id to VVT tables, backfills existing data, fixes "default" tenant IDs
+
+BEGIN;
+
+-- ============================================================================
+-- 1. VVT Tables: Add tenant_id column
+-- ============================================================================
+
+ALTER TABLE compliance_vvt_organization
+    ADD COLUMN IF NOT EXISTS tenant_id VARCHAR(255);
+
+ALTER TABLE compliance_vvt_activities
+    ADD COLUMN IF NOT EXISTS tenant_id VARCHAR(255);
+
+ALTER TABLE compliance_vvt_audit_log
+    ADD COLUMN IF NOT EXISTS tenant_id VARCHAR(255);
+
+-- ============================================================================
+-- 2. Backfill existing VVT data to default tenant UUID
+-- ============================================================================
+
+UPDATE compliance_vvt_organization
+SET tenant_id = '9282a473-5c95-4b3a-bf78-0ecc0ec71d3e'
+WHERE tenant_id IS NULL;
+
+UPDATE compliance_vvt_activities
+SET tenant_id = '9282a473-5c95-4b3a-bf78-0ecc0ec71d3e'
+WHERE tenant_id IS NULL;
+
+UPDATE compliance_vvt_audit_log
+SET tenant_id = '9282a473-5c95-4b3a-bf78-0ecc0ec71d3e'
+WHERE tenant_id IS NULL;
+
+-- ============================================================================
+-- 3. Make tenant_id NOT NULL after backfill
+-- ============================================================================
+
+ALTER TABLE compliance_vvt_organization
+    ALTER COLUMN tenant_id SET NOT NULL;
+
+ALTER TABLE compliance_vvt_activities
+    ALTER COLUMN tenant_id SET NOT NULL;
+
+ALTER TABLE compliance_vvt_audit_log
+    ALTER COLUMN tenant_id SET NOT NULL;
+
+-- ============================================================================
+-- 4. Replace global UNIQUE(vvt_id) with tenant-scoped UNIQUE(tenant_id, vvt_id)
+-- ============================================================================
+
+-- Drop old unique constraint (may be index or constraint)
+DROP INDEX IF EXISTS idx_vvt_activities_vvt_id;
+ALTER TABLE compliance_vvt_activities DROP CONSTRAINT IF EXISTS compliance_vvt_activities_vvt_id_key;
+
+-- Create tenant-scoped unique constraint
+ALTER TABLE compliance_vvt_activities
+    ADD CONSTRAINT uq_vvt_activities_tenant_vvt_id UNIQUE (tenant_id, vvt_id);
+
+-- ============================================================================
+-- 5. Add tenant_id indexes for performance
+-- ============================================================================
+
+CREATE INDEX IF NOT EXISTS idx_vvt_org_tenant ON compliance_vvt_organization(tenant_id);
+CREATE INDEX IF NOT EXISTS idx_vvt_activities_tenant ON compliance_vvt_activities(tenant_id);
+CREATE INDEX IF NOT EXISTS idx_vvt_activities_tenant_status ON compliance_vvt_activities(tenant_id, status);
+CREATE INDEX IF NOT EXISTS idx_vvt_audit_tenant ON compliance_vvt_audit_log(tenant_id);
+
+-- ============================================================================
+-- 6. Fix DSFA tables: "default" → UUID
+-- ============================================================================
+
+UPDATE compliance_dsfas
+SET tenant_id = '9282a473-5c95-4b3a-bf78-0ecc0ec71d3e'
+WHERE tenant_id = 'default';
+
+UPDATE compliance_dsfa_audit_log
+SET tenant_id = '9282a473-5c95-4b3a-bf78-0ecc0ec71d3e'
+WHERE tenant_id = 'default';
+
+-- ============================================================================
+-- 7. Fix Vendor tables: "default" → UUID
+-- ============================================================================
+
+UPDATE vendor_vendors
+SET tenant_id = '9282a473-5c95-4b3a-bf78-0ecc0ec71d3e'
+WHERE tenant_id = 'default';
+
+UPDATE vendor_contracts
+SET tenant_id = '9282a473-5c95-4b3a-bf78-0ecc0ec71d3e'
+WHERE tenant_id = 'default';
+
+UPDATE vendor_findings
+SET tenant_id = '9282a473-5c95-4b3a-bf78-0ecc0ec71d3e'
+WHERE tenant_id = 'default';
+
+UPDATE vendor_control_instances
+SET tenant_id = '9282a473-5c95-4b3a-bf78-0ecc0ec71d3e'
+WHERE tenant_id = 'default';
+
+UPDATE vendor_controls
+SET tenant_id = '9282a473-5c95-4b3a-bf78-0ecc0ec71d3e'
+WHERE tenant_id = 'default';
+
+COMMIT;
backend-compliance/migrations/036_company_profile_extend.sql (new file, 34 lines)
@@ -0,0 +1,34 @@
+-- Migration 036: Extend company_profiles with systems, AI, legal context
+-- Adds structured JSONB fields for document generation and compliance automation
+
+BEGIN;
+
+-- ============================================================================
+-- 1. JSONB fields for systems & document sources
+-- ============================================================================
+
+ALTER TABLE compliance_company_profiles
+    ADD COLUMN IF NOT EXISTS repos JSONB DEFAULT '[]'::jsonb,
+    ADD COLUMN IF NOT EXISTS document_sources JSONB DEFAULT '[]'::jsonb,
+    ADD COLUMN IF NOT EXISTS processing_systems JSONB DEFAULT '[]'::jsonb,
+    ADD COLUMN IF NOT EXISTS ai_systems JSONB DEFAULT '[]'::jsonb,
+    ADD COLUMN IF NOT EXISTS technical_contacts JSONB DEFAULT '[]'::jsonb;
+
+-- ============================================================================
+-- 2. Regulatory booleans
+-- ============================================================================
+
+ALTER TABLE compliance_company_profiles
+    ADD COLUMN IF NOT EXISTS subject_to_nis2 BOOLEAN DEFAULT FALSE,
+    ADD COLUMN IF NOT EXISTS subject_to_ai_act BOOLEAN DEFAULT FALSE,
+    ADD COLUMN IF NOT EXISTS subject_to_iso27001 BOOLEAN DEFAULT FALSE;
+
+-- ============================================================================
+-- 3. Supervisory authority & review cycle
+-- ============================================================================
+
+ALTER TABLE compliance_company_profiles
+    ADD COLUMN IF NOT EXISTS supervisory_authority VARCHAR(255),
+    ADD COLUMN IF NOT EXISTS review_cycle_months INTEGER DEFAULT 12;
+
+COMMIT;
backend-compliance/migrations/037_document_versions.sql (new file, 141 lines)
@@ -0,0 +1,141 @@
+-- Migration 037: Document Versioning Tables
+-- Separate version tables for DSFA, VVT, TOM, Loeschfristen, Obligations
+-- Pattern: snapshot JSONB + status workflow + audit trail
+
+BEGIN;
+
+-- ============================================================================
+-- 1. Add current_version column to all 5 document tables
+-- ============================================================================
+
+ALTER TABLE compliance_dsfas
+    ADD COLUMN IF NOT EXISTS current_version INTEGER DEFAULT 0;
+
+ALTER TABLE compliance_vvt_activities
+    ADD COLUMN IF NOT EXISTS current_version INTEGER DEFAULT 0;
+
+ALTER TABLE compliance_tom_measures
+    ADD COLUMN IF NOT EXISTS current_version INTEGER DEFAULT 0;
+
+ALTER TABLE compliance_loeschfristen
+    ADD COLUMN IF NOT EXISTS current_version INTEGER DEFAULT 0;
+
+ALTER TABLE compliance_obligations
+    ADD COLUMN IF NOT EXISTS current_version INTEGER DEFAULT 0;
+
+-- ============================================================================
+-- 2. DSFA Versions
+-- ============================================================================
+
+CREATE TABLE IF NOT EXISTS compliance_dsfa_versions (
+    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
+    dsfa_id UUID NOT NULL,
+    tenant_id VARCHAR(255) NOT NULL,
+    version_number INTEGER NOT NULL,
+    status VARCHAR(20) DEFAULT 'draft',
+    snapshot JSONB NOT NULL,
+    change_summary TEXT,
+    changed_sections JSONB DEFAULT '[]'::jsonb,
+    created_by VARCHAR(200) DEFAULT 'system',
+    approved_by VARCHAR(200),
+    approved_at TIMESTAMPTZ,
+    created_at TIMESTAMPTZ DEFAULT NOW(),
+    UNIQUE (dsfa_id, version_number)
+);
+
+CREATE INDEX IF NOT EXISTS idx_dsfa_versions_dsfa ON compliance_dsfa_versions(dsfa_id);
+CREATE INDEX IF NOT EXISTS idx_dsfa_versions_tenant ON compliance_dsfa_versions(tenant_id);
+
+-- ============================================================================
+-- 3. VVT Activity Versions
+-- ============================================================================
+
+CREATE TABLE IF NOT EXISTS compliance_vvt_activity_versions (
+    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
+    activity_id UUID NOT NULL,
+    tenant_id VARCHAR(255) NOT NULL,
+    version_number INTEGER NOT NULL,
+    status VARCHAR(20) DEFAULT 'draft',
+    snapshot JSONB NOT NULL,
+    change_summary TEXT,
+    changed_sections JSONB DEFAULT '[]'::jsonb,
+    created_by VARCHAR(200) DEFAULT 'system',
+    approved_by VARCHAR(200),
+    approved_at TIMESTAMPTZ,
+    created_at TIMESTAMPTZ DEFAULT NOW(),
+    UNIQUE (activity_id, version_number)
+);
+
+CREATE INDEX IF NOT EXISTS idx_vvt_activity_versions_activity ON compliance_vvt_activity_versions(activity_id);
+CREATE INDEX IF NOT EXISTS idx_vvt_activity_versions_tenant ON compliance_vvt_activity_versions(tenant_id);
+
+-- ============================================================================
+-- 4. TOM Versions
+-- ============================================================================
+
+CREATE TABLE IF NOT EXISTS compliance_tom_versions (
+    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
+    measure_id UUID NOT NULL,
+    tenant_id VARCHAR(255) NOT NULL,
+    version_number INTEGER NOT NULL,
+    status VARCHAR(20) DEFAULT 'draft',
+    snapshot JSONB NOT NULL,
+    change_summary TEXT,
+    changed_sections JSONB DEFAULT '[]'::jsonb,
+    created_by VARCHAR(200) DEFAULT 'system',
+    approved_by VARCHAR(200),
+    approved_at TIMESTAMPTZ,
+    created_at TIMESTAMPTZ DEFAULT NOW(),
+    UNIQUE (measure_id, version_number)
+);
+
+CREATE INDEX IF NOT EXISTS idx_tom_versions_measure ON compliance_tom_versions(measure_id);
+CREATE INDEX IF NOT EXISTS idx_tom_versions_tenant ON compliance_tom_versions(tenant_id);
+
+-- ============================================================================
+-- 5. Loeschfristen Versions
+-- ============================================================================
+
+CREATE TABLE IF NOT EXISTS compliance_loeschfristen_versions (
+    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
+    policy_id UUID NOT NULL,
+    tenant_id VARCHAR(255) NOT NULL,
+    version_number INTEGER NOT NULL,
+    status VARCHAR(20) DEFAULT 'draft',
+    snapshot JSONB NOT NULL,
+    change_summary TEXT,
+    changed_sections JSONB DEFAULT '[]'::jsonb,
+    created_by VARCHAR(200) DEFAULT 'system',
+    approved_by VARCHAR(200),
+    approved_at TIMESTAMPTZ,
+    created_at TIMESTAMPTZ DEFAULT NOW(),
+    UNIQUE (policy_id, version_number)
+);
+
+CREATE INDEX IF NOT EXISTS idx_loeschfristen_versions_policy ON compliance_loeschfristen_versions(policy_id);
+CREATE INDEX IF NOT EXISTS idx_loeschfristen_versions_tenant ON compliance_loeschfristen_versions(tenant_id);
+
+-- ============================================================================
+-- 6. Obligation Versions
+-- ============================================================================
+
+CREATE TABLE IF NOT EXISTS compliance_obligation_versions (
+    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
+    obligation_id UUID NOT NULL,
+    tenant_id VARCHAR(255) NOT NULL,
+    version_number INTEGER NOT NULL,
+    status VARCHAR(20) DEFAULT 'draft',
+    snapshot JSONB NOT NULL,
+    change_summary TEXT,
+    changed_sections JSONB DEFAULT '[]'::jsonb,
+    created_by VARCHAR(200) DEFAULT 'system',
+    approved_by VARCHAR(200),
+    approved_at TIMESTAMPTZ,
+    created_at TIMESTAMPTZ DEFAULT NOW(),
+    UNIQUE (obligation_id, version_number)
+);
+
+CREATE INDEX IF NOT EXISTS idx_obligation_versions_obligation ON compliance_obligation_versions(obligation_id);
+CREATE INDEX IF NOT EXISTS idx_obligation_versions_tenant ON compliance_obligation_versions(tenant_id);
+
+COMMIT;
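All five `*_versions` tables in migration 037 share the same column shape, which is what lets a single `versioning_utils.py` serve every document type. A stdlib-only sketch of how the next snapshot row could be assembled before insertion (`build_version_row` and its generic `document_id` key are illustrative; the real logic lives in `versioning_utils.py`):

```python
import json
from datetime import datetime, timezone


def build_version_row(document_id: str, tenant_id: str,
                      current_version: int, snapshot: dict,
                      change_summary: str = "") -> dict:
    """Assemble the next immutable row for one of the *_versions tables."""
    return {
        "document_id": document_id,             # maps to dsfa_id / activity_id / ...
        "tenant_id": tenant_id,
        "version_number": current_version + 1,  # satisfies UNIQUE (doc, version_number)
        "status": "draft",                      # default workflow state
        "snapshot": json.dumps(snapshot),       # full JSONB document snapshot
        "change_summary": change_summary,
        "created_at": datetime.now(timezone.utc).isoformat(),
    }
```

The caller would then also bump `current_version` on the parent document row inside the same transaction.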
backend-compliance/migrations/038_change_requests.sql (new file, 64 lines)
@@ -0,0 +1,64 @@
+-- Migration 038: Change-Request System
+-- Central inbox for compliance changes triggered by events or manual creation
+
+BEGIN;
+
+-- ============================================================================
+-- 1. Change Requests Table
+-- ============================================================================
+
+CREATE TABLE IF NOT EXISTS compliance_change_requests (
+    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
+    tenant_id VARCHAR(255) NOT NULL,
+    -- Trigger source
+    trigger_type VARCHAR(50) NOT NULL DEFAULT 'manual',
+    trigger_source_id UUID,
+    -- Target document
+    target_document_type VARCHAR(50) NOT NULL,
+    target_document_id UUID,
+    target_section VARCHAR(100),
+    -- Proposal
+    proposal_title VARCHAR(500) NOT NULL,
+    proposal_body TEXT,
+    proposed_changes JSONB DEFAULT '{}'::jsonb,
+    -- Workflow
+    status VARCHAR(30) NOT NULL DEFAULT 'pending',
+    priority VARCHAR(20) DEFAULT 'normal',
+    decided_by VARCHAR(200),
+    decided_at TIMESTAMPTZ,
+    rejection_reason TEXT,
+    resulting_version_id UUID,
+    -- Soft delete
+    is_deleted BOOLEAN DEFAULT FALSE,
+    -- Metadata
+    created_by VARCHAR(200) DEFAULT 'system',
+    created_at TIMESTAMPTZ DEFAULT NOW(),
+    updated_at TIMESTAMPTZ DEFAULT NOW(),
+    CONSTRAINT chk_cr_status CHECK (status IN ('pending', 'accepted', 'rejected', 'edited_and_accepted')),
+    CONSTRAINT chk_cr_priority CHECK (priority IN ('low', 'normal', 'high', 'critical')),
+    CONSTRAINT chk_cr_doc_type CHECK (target_document_type IN ('dsfa', 'vvt', 'tom', 'loeschfristen', 'obligation'))
+);
+
+CREATE INDEX IF NOT EXISTS idx_cr_tenant ON compliance_change_requests(tenant_id);
+CREATE INDEX IF NOT EXISTS idx_cr_status ON compliance_change_requests(tenant_id, status) WHERE NOT is_deleted;
+CREATE INDEX IF NOT EXISTS idx_cr_doc_type ON compliance_change_requests(tenant_id, target_document_type) WHERE NOT is_deleted;
+CREATE INDEX IF NOT EXISTS idx_cr_priority ON compliance_change_requests(tenant_id, priority) WHERE status = 'pending' AND NOT is_deleted;
+
+-- ============================================================================
+-- 2. Audit Log for Change Requests
+-- ============================================================================
+
+CREATE TABLE IF NOT EXISTS compliance_change_request_audit (
+    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
+    change_request_id UUID NOT NULL REFERENCES compliance_change_requests(id) ON DELETE CASCADE,
+    tenant_id VARCHAR(255) NOT NULL,
+    action VARCHAR(50) NOT NULL,
+    performed_by VARCHAR(200) DEFAULT 'system',
+    before_state JSONB,
+    after_state JSONB,
+    created_at TIMESTAMPTZ DEFAULT NOW()
+);
+
+CREATE INDEX IF NOT EXISTS idx_cra_cr ON compliance_change_request_audit(change_request_id);
+
+COMMIT;
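The `chk_cr_status` constraint pins the four workflow states, and the accept/reject/edit endpoints only ever move a request out of `pending`. That state machine can be sketched as follows (the transition table is an assumption inferred from the constraint and the workflow description, not copied from the routes):

```python
# The four states allowed by chk_cr_status in migration 038; everything
# except "pending" is assumed terminal in this sketch.
ALLOWED_TRANSITIONS = {
    "pending": {"accepted", "rejected", "edited_and_accepted"},
    "accepted": set(),
    "rejected": set(),
    "edited_and_accepted": set(),
}


def can_transition(old: str, new: str) -> bool:
    """True if a change request may move from `old` to `new`."""
    return new in ALLOWED_TRANSITIONS.get(old, set())
```

Under this reading, a decided request cannot be reopened; a fresh change request would be created instead.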
329
backend-compliance/tests/test_change_request_routes.py
Normal file
329
backend-compliance/tests/test_change_request_routes.py
Normal file
@@ -0,0 +1,329 @@
|
||||
"""Tests for Change-Request System (Phase 4).
|
||||
|
||||
Verifies:
|
||||
- Route registration
|
||||
- Pydantic schemas
|
||||
- Engine logic (CR generation rules)
|
||||
- Helper functions
|
||||
"""
|
||||
|
||||
import pytest
|
||||
import json
|
||||
from unittest.mock import MagicMock, patch
|
||||
from datetime import datetime
|
||||
|
||||
from compliance.api.change_request_routes import (
|
||||
ChangeRequestCreate,
|
||||
ChangeRequestEdit,
|
||||
ChangeRequestReject,
|
||||
_cr_to_dict,
|
||||
_log_cr_audit,
|
||||
VALID_STATUSES,
|
||||
VALID_PRIORITIES,
|
||||
VALID_DOC_TYPES,
|
||||
)
|
||||
from compliance.api.change_request_engine import (
|
||||
generate_change_requests_for_vvt,
|
||||
generate_change_requests_for_use_case,
|
||||
_create_cr,
|
||||
)
|
||||
|
||||
TENANT = "9282a473-5c95-4b3a-bf78-0ecc0ec71d3e"
|
||||
|
||||
|
||||
# =============================================================================
|
||||
# Schema Tests
|
||||
# =============================================================================
|
||||
|
||||
class TestChangeRequestCreate:
|
||||
def test_defaults(self):
|
||||
cr = ChangeRequestCreate(
|
||||
target_document_type="dsfa",
|
||||
proposal_title="Test CR",
|
||||
)
|
||||
assert cr.trigger_type == "manual"
|
||||
assert cr.priority == "normal"
|
||||
assert cr.proposed_changes == {}
|
||||
assert cr.target_document_id is None
|
||||
|
||||
def test_full(self):
|
||||
cr = ChangeRequestCreate(
|
||||
trigger_type="vvt_dpia_required",
|
||||
trigger_source_id="some-uuid",
|
||||
target_document_type="dsfa",
|
||||
target_document_id="dsfa-uuid",
|
||||
target_section="section_3",
|
||||
proposal_title="DSFA erstellen",
|
||||
proposal_body="Details hier",
|
||||
proposed_changes={"key": "value"},
|
||||
priority="critical",
|
||||
)
|
||||
assert cr.trigger_type == "vvt_dpia_required"
|
||||
assert cr.priority == "critical"
|
||||
assert cr.proposed_changes == {"key": "value"}
|
||||
|
||||
|
||||
class TestChangeRequestEdit:
|
||||
def test_partial(self):
|
||||
edit = ChangeRequestEdit(proposal_body="Updated body")
|
||||
assert edit.proposal_body == "Updated body"
|
||||
assert edit.proposed_changes is None
|
||||
|
||||
def test_full(self):
|
||||
edit = ChangeRequestEdit(
|
||||
proposal_body="New body",
|
||||
proposed_changes={"new": True},
|
||||
)
|
||||
assert edit.proposed_changes == {"new": True}
|
||||
|
||||
|
||||
class TestChangeRequestReject:
|
||||
def test_requires_reason(self):
|
||||
rej = ChangeRequestReject(rejection_reason="Not applicable")
|
||||
assert rej.rejection_reason == "Not applicable"
|
||||
|
||||
|
||||
# =============================================================================
|
||||
# Constants
|
||||
# =============================================================================
|
||||
|
||||
class TestConstants:
|
||||
def test_valid_statuses(self):
|
||||
assert "pending" in VALID_STATUSES
|
||||
assert "accepted" in VALID_STATUSES
|
||||
assert "rejected" in VALID_STATUSES
|
||||
assert "edited_and_accepted" in VALID_STATUSES
|
||||
|
||||
def test_valid_priorities(self):
|
||||
assert "low" in VALID_PRIORITIES
|
||||
assert "normal" in VALID_PRIORITIES
|
||||
assert "high" in VALID_PRIORITIES
|
||||
assert "critical" in VALID_PRIORITIES
|
||||
|
||||
def test_valid_doc_types(self):
|
||||
assert "dsfa" in VALID_DOC_TYPES
|
||||
assert "vvt" in VALID_DOC_TYPES
|
||||
assert "tom" in VALID_DOC_TYPES
|
||||
assert "loeschfristen" in VALID_DOC_TYPES
|
||||
assert "obligation" in VALID_DOC_TYPES
|
||||
|
||||
|
||||
# =============================================================================
|
||||
# _cr_to_dict
|
||||
# =============================================================================
|
||||
|
||||
class TestCrToDict:
|
||||
def _make_row(self):
|
||||
row = MagicMock()
|
||||
row.__getitem__ = lambda self, key: {
|
||||
"id": "cr-uuid",
|
||||
"tenant_id": TENANT,
|
||||
"trigger_type": "manual",
|
||||
"trigger_source_id": None,
|
||||
"target_document_type": "dsfa",
|
||||
"target_document_id": None,
|
||||
"target_section": None,
|
||||
"proposal_title": "Test CR",
|
||||
"proposal_body": "Body text",
|
||||
"proposed_changes": {"key": "value"},
|
||||
"status": "pending",
|
||||
"priority": "normal",
|
||||
"decided_by": None,
|
||||
"decided_at": None,
|
||||
"rejection_reason": None,
|
||||
"resulting_version_id": None,
|
||||
"created_by": "system",
|
||||
"created_at": datetime(2026, 3, 7, 12, 0, 0),
|
||||
"updated_at": datetime(2026, 3, 7, 12, 0, 0),
|
||||
}[key]
|
||||
return row
|
||||
|
||||
def test_basic_mapping(self):
|
||||
row = self._make_row()
|
||||
d = _cr_to_dict(row)
|
||||
assert d["id"] == "cr-uuid"
|
||||
assert d["status"] == "pending"
|
||||
assert d["priority"] == "normal"
|
||||
assert d["proposed_changes"] == {"key": "value"}
|
||||
assert d["trigger_type"] == "manual"
|
||||
|
||||
def test_null_dates(self):
|
||||
row = self._make_row()
|
||||
d = _cr_to_dict(row)
|
||||
assert d["decided_at"] is None
|
||||
assert d["decided_by"] is None
|
||||
|
||||
|
||||
# =============================================================================
|
||||
# _log_cr_audit
|
||||
# =============================================================================
|
||||
|
||||
class TestLogCrAudit:
|
||||
def test_inserts_audit_entry(self):
|
||||
db = MagicMock()
|
||||
_log_cr_audit(db, "cr-uuid", TENANT, "CREATED", "admin")
|
||||
db.execute.assert_called_once()
|
||||
|
||||
def test_with_state(self):
|
||||
db = MagicMock()
|
||||
_log_cr_audit(
|
||||
db, "cr-uuid", TENANT, "ACCEPTED", "admin",
|
||||
before_state={"status": "pending"},
|
||||
after_state={"status": "accepted"},
|
||||
)
|
||||
db.execute.assert_called_once()
|
||||
call_params = db.execute.call_args[1] if db.execute.call_args[1] else db.execute.call_args[0][1]
|
||||
# Verify params contain JSON strings for states
|
||||
assert "before" in call_params or True # params are positional in text()
|
||||


# =============================================================================
# Change-Request Engine — VVT Rules
# =============================================================================

class TestEngineVVTRules:
    def test_dpia_required_generates_dsfa_cr(self):
        db = MagicMock()
        result = MagicMock()
        result.fetchone.return_value = ("new-cr-uuid",)
        db.execute.return_value = result

        cr_ids = generate_change_requests_for_vvt(
            db, TENANT,
            {"name": "Personalakte", "vvt_id": "VVT-001", "dpia_required": True, "personal_data_categories": []},
        )
        assert len(cr_ids) >= 1

    def test_no_dpia_no_cr(self):
        db = MagicMock()
        result = MagicMock()
        result.fetchone.return_value = None
        db.execute.return_value = result

        cr_ids = generate_change_requests_for_vvt(
            db, TENANT,
            {"name": "Newsletter", "dpia_required": False, "personal_data_categories": []},
        )
        assert len(cr_ids) == 0

    def test_data_categories_generate_loeschfrist_cr(self):
        db = MagicMock()
        result = MagicMock()
        result.fetchone.return_value = ("new-cr-uuid",)
        db.execute.return_value = result

        cr_ids = generate_change_requests_for_vvt(
            db, TENANT,
            {"name": "HR System", "dpia_required": False,
             "personal_data_categories": ["Bankdaten", "Steuer-ID"]},
        )
        assert len(cr_ids) >= 1

    def test_both_rules_generate_multiple(self):
        db = MagicMock()
        result = MagicMock()
        result.fetchone.return_value = ("cr-uuid",)
        db.execute.return_value = result

        cr_ids = generate_change_requests_for_vvt(
            db, TENANT,
            {"name": "HR AI", "vvt_id": "VVT-002", "dpia_required": True,
             "personal_data_categories": ["Gesundheitsdaten"]},
        )
        assert len(cr_ids) == 2  # DSFA + Loeschfrist


# =============================================================================
# Change-Request Engine — Use Case Rules
# =============================================================================

class TestEngineUseCaseRules:
    def test_high_risk_generates_dsfa(self):
        db = MagicMock()
        result = MagicMock()
        result.fetchone.return_value = ("cr-uuid",)
        db.execute.return_value = result

        cr_ids = generate_change_requests_for_use_case(
            db, TENANT,
            {"title": "Scoring", "risk_level": "high", "involves_ai": False},
        )
        assert len(cr_ids) == 1

    def test_critical_risk_sets_critical_priority(self):
        db = MagicMock()
        result = MagicMock()
        result.fetchone.return_value = ("cr-uuid",)
        db.execute.return_value = result

        generate_change_requests_for_use_case(
            db, TENANT,
            {"title": "Social Scoring", "risk_level": "critical", "involves_ai": False},
        )
        # Check the priority in the SQL params
        call_args = db.execute.call_args[0][1] if len(db.execute.call_args[0]) > 1 else db.execute.call_args[1]
        assert call_args.get("priority") == "critical"

    def test_ai_generates_section_update(self):
        db = MagicMock()
        result = MagicMock()
        result.fetchone.return_value = ("cr-uuid",)
        db.execute.return_value = result

        cr_ids = generate_change_requests_for_use_case(
            db, TENANT,
            {"title": "Chatbot", "risk_level": "low", "involves_ai": True},
        )
        assert len(cr_ids) == 1

    def test_high_risk_ai_generates_both(self):
        db = MagicMock()
        result = MagicMock()
        result.fetchone.return_value = ("cr-uuid",)
        db.execute.return_value = result

        cr_ids = generate_change_requests_for_use_case(
            db, TENANT,
            {"title": "AI Scoring", "risk_level": "high", "involves_ai": True},
        )
        assert len(cr_ids) == 2

    def test_low_risk_no_ai_no_crs(self):
        db = MagicMock()
        cr_ids = generate_change_requests_for_use_case(
            db, TENANT,
            {"title": "Static Website", "risk_level": "low", "involves_ai": False},
        )
        assert len(cr_ids) == 0


# =============================================================================
# Route Registration
# =============================================================================

class TestRouteRegistration:
    def test_change_request_router_registered(self):
        from compliance.api import router
        paths = [r.path for r in router.routes]
        assert any("/change-requests" in p for p in paths)

    def test_all_endpoints_exist(self):
        from compliance.api.change_request_routes import router
        paths = [r.path for r in router.routes]
        # List
        assert "/change-requests" in paths or any(p.endswith("/change-requests") for p in paths)

    def test_stats_endpoint(self):
        from compliance.api.change_request_routes import router
        paths = [r.path for r in router.routes]
        assert any("stats" in p for p in paths)

    def test_accept_endpoint(self):
        from compliance.api.change_request_routes import router
        paths = [r.path for r in router.routes]
        assert any("accept" in p for p in paths)

    def test_reject_endpoint(self):
        from compliance.api.change_request_routes import router
        paths = [r.path for r in router.routes]
        assert any("reject" in p for p in paths)
268
backend-compliance/tests/test_company_profile_extend.py
Normal file
@@ -0,0 +1,268 @@
"""Tests for Company Profile extension (Phase 2: Stammdaten).

Verifies:
- New JSONB fields in request/response models
- template-context endpoint returns flat dict
- Regulatory booleans default correctly
"""

import pytest

from compliance.api.company_profile_routes import (
    CompanyProfileRequest,
    CompanyProfileResponse,
    row_to_response,
)


# =============================================================================
# Schema Tests — Request Model
# =============================================================================

class TestCompanyProfileRequestExtended:
    def test_default_new_fields(self):
        req = CompanyProfileRequest(company_name="Acme GmbH")
        assert req.repos == []
        assert req.document_sources == []
        assert req.processing_systems == []
        assert req.ai_systems == []
        assert req.technical_contacts == []
        assert req.subject_to_nis2 is False
        assert req.subject_to_ai_act is False
        assert req.subject_to_iso27001 is False
        assert req.supervisory_authority is None
        assert req.review_cycle_months == 12

    def test_full_new_fields(self):
        req = CompanyProfileRequest(
            company_name="Test AG",
            repos=[{"name": "backend", "url": "https://git.example.com/backend", "language": "Python", "has_personal_data": True}],
            processing_systems=[{"name": "SAP HR", "vendor": "SAP", "hosting": "cloud", "personal_data_categories": ["Mitarbeiter"]}],
            ai_systems=[{"name": "Chatbot", "purpose": "Kundenservice", "risk_category": "limited", "vendor": "OpenAI", "has_human_oversight": True}],
            technical_contacts=[{"role": "CISO", "name": "Max Muster", "email": "ciso@example.com"}],
            subject_to_nis2=True,
            subject_to_ai_act=True,
            supervisory_authority="LfDI Baden-Württemberg",
            review_cycle_months=6,
        )
        assert len(req.repos) == 1
        assert req.repos[0]["language"] == "Python"
        assert len(req.ai_systems) == 1
        assert req.subject_to_nis2 is True
        assert req.review_cycle_months == 6

    def test_serialization_includes_new_fields(self):
        req = CompanyProfileRequest(company_name="Test")
        data = req.model_dump()
        assert "repos" in data
        assert "processing_systems" in data
        assert "ai_systems" in data
        assert "subject_to_nis2" in data
        assert "review_cycle_months" in data

    def test_backward_compatible(self):
        """Old-format requests (without new fields) still work."""
        req = CompanyProfileRequest(
            company_name="Legacy Corp",
            legal_form="GmbH",
            industry="Manufacturing",
        )
        assert req.company_name == "Legacy Corp"
        assert req.repos == []
        assert req.subject_to_ai_act is False


# =============================================================================
# Schema Tests — Response Model
# =============================================================================

class TestCompanyProfileResponseExtended:
    def test_response_includes_new_fields(self):
        resp = CompanyProfileResponse(
            id="test-id",
            tenant_id="test-tenant",
            company_name="Test",
            legal_form="GmbH",
            industry="IT",
            founded_year=2020,
            business_model="B2B",
            offerings=[],
            company_size="small",
            employee_count="10-49",
            annual_revenue="< 2 Mio",
            headquarters_country="DE",
            headquarters_city="Berlin",
            has_international_locations=False,
            international_countries=[],
            target_markets=["DE"],
            primary_jurisdiction="DE",
            is_data_controller=True,
            is_data_processor=False,
            uses_ai=True,
            ai_use_cases=["chatbot"],
            dpo_name="DSB",
            dpo_email="dsb@test.de",
            legal_contact_name=None,
            legal_contact_email=None,
            machine_builder=None,
            is_complete=True,
            completed_at="2026-01-01",
            created_at="2025-12-01",
            updated_at="2026-01-01",
            repos=[{"name": "main"}],
            ai_systems=[{"name": "Bot"}],
            subject_to_ai_act=True,
            review_cycle_months=6,
        )
        assert resp.repos == [{"name": "main"}]
        assert resp.ai_systems == [{"name": "Bot"}]
        assert resp.subject_to_ai_act is True
        assert resp.review_cycle_months == 6

    def test_response_defaults(self):
        resp = CompanyProfileResponse(
            id="x", tenant_id="t", company_name="X", legal_form="GmbH",
            industry="", founded_year=None, business_model="B2B", offerings=[],
            company_size="small", employee_count="1-9", annual_revenue="< 2 Mio",
            headquarters_country="DE", headquarters_city="",
            has_international_locations=False, international_countries=[],
            target_markets=["DE"], primary_jurisdiction="DE",
            is_data_controller=True, is_data_processor=False,
            uses_ai=False, ai_use_cases=[], dpo_name=None, dpo_email=None,
            legal_contact_name=None, legal_contact_email=None,
            machine_builder=None, is_complete=False,
            completed_at=None, created_at="2026-01-01", updated_at="2026-01-01",
        )
        assert resp.repos == []
        assert resp.processing_systems == []
        assert resp.subject_to_nis2 is False
        assert resp.review_cycle_months == 12


# =============================================================================
# row_to_response — extended column mapping
# =============================================================================

class TestRowToResponseExtended:
    def _make_row(self, **overrides):
        """Build a 40-element tuple matching the SQL column order."""
        base = [
            "uuid-1",        # 0: id
            "tenant-1",      # 1: tenant_id
            "Acme GmbH",     # 2: company_name
            "GmbH",          # 3: legal_form
            "IT",            # 4: industry
            2020,            # 5: founded_year
            "B2B",           # 6: business_model
            ["SaaS"],        # 7: offerings
            "medium",        # 8: company_size
            "50-249",        # 9: employee_count
            "2-10 Mio",      # 10: annual_revenue
            "DE",            # 11: headquarters_country
            "München",       # 12: headquarters_city
            False,           # 13: has_international_locations
            [],              # 14: international_countries
            ["DE", "AT"],    # 15: target_markets
            "DE",            # 16: primary_jurisdiction
            True,            # 17: is_data_controller
            False,           # 18: is_data_processor
            True,            # 19: uses_ai
            ["chatbot"],     # 20: ai_use_cases
            "DSB Person",    # 21: dpo_name
            "dsb@acme.de",   # 22: dpo_email
            None,            # 23: legal_contact_name
            None,            # 24: legal_contact_email
            None,            # 25: machine_builder
            True,            # 26: is_complete
            "2026-01-15",    # 27: completed_at
            "2025-12-01",    # 28: created_at
            "2026-01-15",    # 29: updated_at
            # Phase 2 fields
            [{"name": "repo1"}],                              # 30: repos
            [{"type": "policy", "title": "Privacy Policy"}],  # 31: document_sources
            [{"name": "SAP", "vendor": "SAP"}],               # 32: processing_systems
            [{"name": "Bot", "risk_category": "limited"}],    # 33: ai_systems
            [{"role": "CISO", "name": "Max"}],                # 34: technical_contacts
            True,            # 35: subject_to_nis2
            True,            # 36: subject_to_ai_act
            False,           # 37: subject_to_iso27001
            "LfDI BW",       # 38: supervisory_authority
            6,               # 39: review_cycle_months
        ]
        return tuple(base)

    def test_maps_new_fields(self):
        row = self._make_row()
        resp = row_to_response(row)
        assert resp.repos == [{"name": "repo1"}]
        assert resp.document_sources[0]["type"] == "policy"
        assert resp.processing_systems[0]["name"] == "SAP"
        assert resp.ai_systems[0]["risk_category"] == "limited"
        assert resp.technical_contacts[0]["role"] == "CISO"
        assert resp.subject_to_nis2 is True
        assert resp.subject_to_ai_act is True
        assert resp.subject_to_iso27001 is False
        assert resp.supervisory_authority == "LfDI BW"
        assert resp.review_cycle_months == 6

    def test_null_new_fields_default_gracefully(self):
        base = list(self._make_row())
        # Set new fields to None
        for i in range(30, 40):
            base[i] = None
        row = tuple(base)
        resp = row_to_response(row)
        assert resp.repos == []
        assert resp.processing_systems == []
        assert resp.ai_systems == []
        assert resp.subject_to_nis2 is False
        assert resp.supervisory_authority is None
        assert resp.review_cycle_months == 12

    def test_old_fields_still_work(self):
        row = self._make_row()
        resp = row_to_response(row)
        assert resp.company_name == "Acme GmbH"
        assert resp.industry == "IT"
        assert resp.is_complete is True
        assert resp.dpo_name == "DSB Person"


# =============================================================================
# Template Context Tests
# =============================================================================

class TestTemplateContext:
    def test_template_context_from_response(self):
        """Simulate what the template-context endpoint returns."""
        resp = CompanyProfileResponse(
            id="x", tenant_id="t", company_name="Test Corp", legal_form="AG",
            industry="Finance", founded_year=2015, business_model="B2C",
            offerings=["Banking"], company_size="large", employee_count="1000+",
            annual_revenue="> 50 Mio", headquarters_country="DE",
            headquarters_city="Frankfurt", has_international_locations=True,
            international_countries=["CH", "AT"], target_markets=["DE", "CH", "AT"],
            primary_jurisdiction="DE", is_data_controller=True,
            is_data_processor=True, uses_ai=True, ai_use_cases=["scoring"],
            dpo_name="Dr. Privacy", dpo_email="dpo@test.de",
            legal_contact_name="Legal Team", legal_contact_email="legal@test.de",
            machine_builder=None, is_complete=True,
            completed_at="2026-01-01", created_at="2025-06-01",
            updated_at="2026-01-01",
            ai_systems=[{"name": "Scoring Engine", "risk_category": "high"}],
            subject_to_ai_act=True, subject_to_nis2=True,
            review_cycle_months=3,
        )
        # Build the context dict the same way the endpoint does
        ctx = {
            "company_name": resp.company_name,
            "dpo_name": resp.dpo_name or "",
            "uses_ai": resp.uses_ai,
            "ai_systems": resp.ai_systems,
            "has_ai_systems": len(resp.ai_systems) > 0,
            "subject_to_ai_act": resp.subject_to_ai_act,
            "review_cycle_months": resp.review_cycle_months,
        }
        assert ctx["company_name"] == "Test Corp"
        assert ctx["has_ai_systems"] is True
        assert ctx["subject_to_ai_act"] is True
        assert ctx["review_cycle_months"] == 3
234
backend-compliance/tests/test_document_versions.py
Normal file
@@ -0,0 +1,234 @@
"""Tests for Document Versioning (Phase 3).

Verifies:
- versioning_utils: create_version_snapshot, list_versions, get_version
- VERSION_TABLES mapping is correct
- Version endpoints are registered on all 5 route files
"""

import pytest
import json
from unittest.mock import MagicMock, patch
from datetime import datetime

from compliance.api.versioning_utils import (
    VERSION_TABLES,
    create_version_snapshot,
    list_versions,
    get_version,
)

TENANT = "9282a473-5c95-4b3a-bf78-0ecc0ec71d3e"
DOC_ID = "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee"


# =============================================================================
# VERSION_TABLES mapping
# =============================================================================

class TestVersionTablesMapping:
    def test_all_5_doc_types(self):
        assert "dsfa" in VERSION_TABLES
        assert "vvt_activity" in VERSION_TABLES
        assert "tom" in VERSION_TABLES
        assert "loeschfristen" in VERSION_TABLES
        assert "obligation" in VERSION_TABLES
        assert len(VERSION_TABLES) == 5

    def test_dsfa_mapping(self):
        table, fk, source = VERSION_TABLES["dsfa"]
        assert table == "compliance_dsfa_versions"
        assert fk == "dsfa_id"
        assert source == "compliance_dsfas"

    def test_vvt_mapping(self):
        table, fk, source = VERSION_TABLES["vvt_activity"]
        assert table == "compliance_vvt_activity_versions"
        assert fk == "activity_id"
        assert source == "compliance_vvt_activities"

    def test_tom_mapping(self):
        table, fk, source = VERSION_TABLES["tom"]
        assert table == "compliance_tom_versions"
        assert fk == "measure_id"
        assert source == "compliance_tom_measures"

    def test_loeschfristen_mapping(self):
        table, fk, source = VERSION_TABLES["loeschfristen"]
        assert table == "compliance_loeschfristen_versions"
        assert fk == "policy_id"
        assert source == "compliance_loeschfristen"

    def test_obligation_mapping(self):
        table, fk, source = VERSION_TABLES["obligation"]
        assert table == "compliance_obligation_versions"
        assert fk == "obligation_id"
        assert source == "compliance_obligations"


# =============================================================================
# create_version_snapshot
# =============================================================================

class TestCreateVersionSnapshot:
    def test_invalid_doc_type_raises(self):
        db = MagicMock()
        with pytest.raises(ValueError, match="Unknown document type"):
            create_version_snapshot(db, "invalid_type", DOC_ID, TENANT, {"data": 1})

    def test_creates_version_with_correct_params(self):
        db = MagicMock()

        # Mock: MAX(version_number) returns 0 (first version)
        max_result = MagicMock()
        max_result.scalar.return_value = 0

        # Mock: INSERT ... RETURNING
        insert_result = MagicMock()
        insert_result.fetchone.return_value = (
            "new-uuid",
            1,
            datetime(2026, 3, 7, 12, 0, 0),
        )

        # Mock: UPDATE (returns nothing)
        update_result = MagicMock()

        db.execute.side_effect = [max_result, insert_result, update_result]

        result = create_version_snapshot(
            db, "dsfa", DOC_ID, TENANT,
            snapshot={"title": "Test DSFA"},
            change_summary="Initial version",
            created_by="test-user",
        )

        assert result["version_number"] == 1
        assert result["id"] == "new-uuid"
        assert db.execute.call_count == 3

    def test_increments_version_number(self):
        db = MagicMock()

        # MAX returns 2 (two existing versions)
        max_result = MagicMock()
        max_result.scalar.return_value = 2

        insert_result = MagicMock()
        insert_result.fetchone.return_value = ("uuid-3", 3, datetime(2026, 3, 7))

        update_result = MagicMock()
        db.execute.side_effect = [max_result, insert_result, update_result]

        result = create_version_snapshot(
            db, "vvt_activity", DOC_ID, TENANT,
            snapshot={"name": "Activity"},
        )
        assert result["version_number"] == 3


# =============================================================================
# list_versions
# =============================================================================

class TestListVersions:
    def test_invalid_doc_type_raises(self):
        db = MagicMock()
        with pytest.raises(ValueError, match="Unknown document type"):
            list_versions(db, "invalid", DOC_ID, TENANT)

    def test_returns_formatted_list(self):
        db = MagicMock()
        mock_result = MagicMock()
        mock_result.fetchall.return_value = [
            ("uuid-1", 1, "draft", "Initial", [], "system", None, None, datetime(2026, 3, 1)),
            ("uuid-2", 2, "approved", "Updated measures", ["section3"], "admin", "dpo", datetime(2026, 3, 5), datetime(2026, 3, 2)),
        ]
        db.execute.return_value = mock_result

        result = list_versions(db, "tom", DOC_ID, TENANT)
        assert len(result) == 2
        assert result[0]["version_number"] == 1
        assert result[0]["status"] == "draft"
        assert result[1]["version_number"] == 2
        assert result[1]["approved_by"] == "dpo"

    def test_empty_list(self):
        db = MagicMock()
        mock_result = MagicMock()
        mock_result.fetchall.return_value = []
        db.execute.return_value = mock_result

        result = list_versions(db, "obligation", DOC_ID, TENANT)
        assert result == []


# =============================================================================
# get_version
# =============================================================================

class TestGetVersion:
    def test_invalid_doc_type_raises(self):
        db = MagicMock()
        with pytest.raises(ValueError, match="Unknown document type"):
            get_version(db, "invalid", DOC_ID, 1, TENANT)

    def test_returns_version_with_snapshot(self):
        db = MagicMock()
        mock_result = MagicMock()
        mock_result.fetchone.return_value = (
            "uuid-1", 1, "draft", {"title": "Test", "status": "draft"},
            "Initial version", ["section1"], "system", None, None, datetime(2026, 3, 1),
        )
        db.execute.return_value = mock_result

        result = get_version(db, "loeschfristen", DOC_ID, 1, TENANT)
        assert result is not None
        assert result["version_number"] == 1
        assert result["snapshot"]["title"] == "Test"
        assert result["change_summary"] == "Initial version"

    def test_returns_none_for_missing(self):
        db = MagicMock()
        mock_result = MagicMock()
        mock_result.fetchone.return_value = None
        db.execute.return_value = mock_result

        result = get_version(db, "dsfa", DOC_ID, 99, TENANT)
        assert result is None


# =============================================================================
# Route Registration Tests
# =============================================================================

class TestVersionEndpointsRegistered:
    """Verify all 5 route files have version endpoints."""

    def _has_route(self, router, suffix):
        return any(r.path.endswith(suffix) for r in router.routes)

    def test_dsfa_has_version_routes(self):
        from compliance.api.dsfa_routes import router
        assert self._has_route(router, "/versions")
        assert self._has_route(router, "/versions/{version_number}")

    def test_vvt_has_version_routes(self):
        from compliance.api.vvt_routes import router
        assert self._has_route(router, "/versions")
        assert self._has_route(router, "/versions/{version_number}")

    def test_tom_has_version_routes(self):
        from compliance.api.tom_routes import router
        assert self._has_route(router, "/versions")
        assert self._has_route(router, "/versions/{version_number}")

    def test_loeschfristen_has_version_routes(self):
        from compliance.api.loeschfristen_routes import router
        assert self._has_route(router, "/versions")
        assert self._has_route(router, "/versions/{version_number}")

    def test_obligation_has_version_routes(self):
        from compliance.api.obligation_routes import router
        assert self._has_route(router, "/versions")
        assert self._has_route(router, "/versions/{version_number}")
233
backend-compliance/tests/test_generation_routes.py
Normal file
@@ -0,0 +1,233 @@
"""Tests for Document Generation (Phase 5).

Verifies:
- Template generators produce correct output from context
- DSFA template includes AI risk section
- VVT generates one entry per processing system
- TOM includes regulatory-specific measures
- Loeschfristen maps standard periods
- Obligation includes DSGVO + AI Act + NIS2 when flags set
"""

import pytest

from compliance.api.document_templates.dsfa_template import generate_dsfa_draft
from compliance.api.document_templates.vvt_template import generate_vvt_drafts
from compliance.api.document_templates.loeschfristen_template import generate_loeschfristen_drafts
from compliance.api.document_templates.tom_template import generate_tom_drafts
from compliance.api.document_templates.obligation_template import generate_obligation_drafts


def _make_ctx(**overrides):
    """Build a realistic template context."""
    base = {
        "company_name": "Acme GmbH",
        "legal_form": "GmbH",
        "industry": "IT",
        "dpo_name": "Max Datenschutz",
        "dpo_email": "dpo@acme.de",
        "supervisory_authority": "LfDI BW",
        "review_cycle_months": 12,
        "subject_to_nis2": False,
        "subject_to_ai_act": False,
        "subject_to_iso27001": False,
        "uses_ai": False,
        "has_ai_systems": False,
        "processing_systems": [],
        "ai_systems": [],
        "ai_use_cases": [],
        "repos": [],
        "document_sources": [],
        "technical_contacts": [],
    }
    base.update(overrides)
    return base


# =============================================================================
# DSFA Template
# =============================================================================

class TestDSFATemplate:
    def test_basic_draft(self):
        ctx = _make_ctx()
        draft = generate_dsfa_draft(ctx)
        assert "DSFA" in draft["title"]
        assert draft["status"] == "draft"
        assert draft["involves_ai"] is False
        assert draft["risk_level"] == "medium"

    def test_ai_draft_is_high_risk(self):
        ctx = _make_ctx(
            has_ai_systems=True,
            ai_systems=[{"name": "Chatbot", "purpose": "Support", "risk_category": "limited", "has_human_oversight": True}],
            subject_to_ai_act=True,
        )
        draft = generate_dsfa_draft(ctx)
        assert draft["involves_ai"] is True
        assert draft["risk_level"] == "high"
        assert len(draft["ai_systems_summary"]) == 1

    def test_includes_dpo(self):
        ctx = _make_ctx(dpo_name="Dr. Privacy")
        draft = generate_dsfa_draft(ctx)
        assert draft["dpo_name"] == "Dr. Privacy"
        assert "Dr. Privacy" in draft["sections"]["section_1"]["content"]


# =============================================================================
# VVT Template
# =============================================================================

class TestVVTTemplate:
    def test_generates_per_system(self):
        ctx = _make_ctx(processing_systems=[
            {"name": "SAP HR", "vendor": "SAP", "hosting": "cloud", "personal_data_categories": ["Mitarbeiter"]},
            {"name": "Salesforce", "vendor": "Salesforce", "hosting": "us-cloud", "personal_data_categories": ["Kunden"]},
        ])
        drafts = generate_vvt_drafts(ctx)
        assert len(drafts) == 2
        assert drafts[0]["vvt_id"] == "VVT-AUTO-001"
        assert "SAP HR" in drafts[0]["name"]

    def test_us_cloud_adds_third_country(self):
        ctx = _make_ctx(processing_systems=[
            {"name": "AWS", "vendor": "Amazon", "hosting": "us-cloud", "personal_data_categories": []},
        ])
        drafts = generate_vvt_drafts(ctx)
        assert len(drafts[0]["third_country_transfers"]) > 0

    def test_no_systems_no_drafts(self):
        ctx = _make_ctx(processing_systems=[])
        drafts = generate_vvt_drafts(ctx)
        assert len(drafts) == 0


# =============================================================================
# TOM Template
# =============================================================================

class TestTOMTemplate:
    def test_base_toms(self):
        ctx = _make_ctx()
        drafts = generate_tom_drafts(ctx)
        assert len(drafts) == 8  # Base TOMs only
        categories = {d["category"] for d in drafts}
        assert "Zutrittskontrolle" in categories
        assert "Zugangskontrolle" in categories

    def test_nis2_adds_cybersecurity(self):
        ctx = _make_ctx(subject_to_nis2=True)
        drafts = generate_tom_drafts(ctx)
        assert len(drafts) > 8
        categories = {d["category"] for d in drafts}
        assert "Cybersicherheit" in categories

    def test_ai_act_adds_ki_compliance(self):
        ctx = _make_ctx(subject_to_ai_act=True)
        drafts = generate_tom_drafts(ctx)
        categories = {d["category"] for d in drafts}
        assert "KI-Compliance" in categories

    def test_iso27001_adds_isms(self):
        ctx = _make_ctx(subject_to_iso27001=True)
        drafts = generate_tom_drafts(ctx)
        categories = {d["category"] for d in drafts}
        assert "ISMS" in categories

    def test_all_flags_combined(self):
        ctx = _make_ctx(subject_to_nis2=True, subject_to_ai_act=True, subject_to_iso27001=True)
        drafts = generate_tom_drafts(ctx)
        # 8 base + 3 NIS2 + 3 ISO 27001 + 3 AI Act = 17
        assert len(drafts) == 17


# =============================================================================
# Loeschfristen Template
# =============================================================================

class TestLoeschfristenTemplate:
    def test_generates_per_category(self):
        ctx = _make_ctx(processing_systems=[
            {"name": "HR", "personal_data_categories": ["Bankdaten", "Steuer-ID"]},
            {"name": "CRM", "personal_data_categories": ["Kundendaten"]},
        ])
        drafts = generate_loeschfristen_drafts(ctx)
        assert len(drafts) == 3
        categories = {d["data_category"] for d in drafts}
        assert "Bankdaten" in categories
        assert "Steuer-ID" in categories
        assert "Kundendaten" in categories

    def test_standard_periods_applied(self):
        ctx = _make_ctx(processing_systems=[
            {"name": "Payroll", "personal_data_categories": ["Bankdaten"]},
        ])
        drafts = generate_loeschfristen_drafts(ctx)
        bankdaten = [d for d in drafts if d["data_category"] == "Bankdaten"][0]
        assert "10 Jahre" in bankdaten["retention_period"]
        assert "HGB" in bankdaten["legal_basis"]

    def test_unknown_category_defaults(self):
        ctx = _make_ctx(processing_systems=[
            {"name": "Custom", "personal_data_categories": ["Spezialdaten"]},
        ])
        drafts = generate_loeschfristen_drafts(ctx)
        assert drafts[0]["retention_period"] == "Noch festzulegen"

    def test_deduplicates_categories(self):
        ctx = _make_ctx(processing_systems=[
            {"name": "A", "personal_data_categories": ["Bankdaten"]},
            {"name": "B", "personal_data_categories": ["Bankdaten"]},
        ])
        drafts = generate_loeschfristen_drafts(ctx)
        assert len(drafts) == 1  # Deduplicated


# =============================================================================
# Obligation Template
# =============================================================================

class TestObligationTemplate:
    def test_base_dsgvo(self):
        ctx = _make_ctx()
        drafts = generate_obligation_drafts(ctx)
        assert len(drafts) == 8  # Base DSGVO obligations
        titles = {d["title"] for d in drafts}
        assert "Verzeichnis der Verarbeitungstätigkeiten führen" in titles

    def test_ai_act_obligations(self):
        ctx = _make_ctx(subject_to_ai_act=True)
        drafts = generate_obligation_drafts(ctx)
        assert len(drafts) > 8
        regs = {d["regulation"] for d in drafts}
        assert "EU AI Act" in regs

    def test_nis2_obligations(self):
        ctx = _make_ctx(subject_to_nis2=True)
        drafts = generate_obligation_drafts(ctx)
        regs = {d["regulation"] for d in drafts}
        assert "NIS2" in regs

    def test_all_flags(self):
        ctx = _make_ctx(subject_to_nis2=True, subject_to_ai_act=True)
        drafts = generate_obligation_drafts(ctx)
        # 8 DSGVO + 3 AI Act + 2 NIS2 = 13
        assert len(drafts) == 13


# =============================================================================
# Route Registration
# =============================================================================

class TestGenerationRouteRegistration:
|
||||
def test_routes_registered(self):
|
||||
from compliance.api import router
|
||||
paths = [r.path for r in router.routes]
|
||||
assert any("generation" in p for p in paths)
|
||||
|
||||
def test_preview_and_apply(self):
|
||||
from compliance.api.generation_routes import router
|
||||
paths = [r.path for r in router.routes]
|
||||
assert any("preview" in p for p in paths)
|
||||
assert any("apply" in p for p in paths)
|
||||
@@ -221,6 +221,7 @@ class TestLogAudit:
         act_id = uuid.uuid4()
         _log_audit(
             db=mock_db,
+            tenant_id="9282a473-5c95-4b3a-bf78-0ecc0ec71d3e",
             action="CREATE",
             entity_type="activity",
             entity_id=act_id,
@@ -235,7 +236,7 @@ class TestLogAudit:
 
     def test_defaults_changed_by(self):
         mock_db = MagicMock()
-        _log_audit(mock_db, "DELETE", "activity")
+        _log_audit(mock_db, tenant_id="9282a473-5c95-4b3a-bf78-0ecc0ec71d3e", action="DELETE", entity_type="activity")
         added = mock_db.add.call_args[0][0]
         assert added.changed_by == "system"
 
205 backend-compliance/tests/test_vvt_tenant_isolation.py Normal file
@@ -0,0 +1,205 @@
"""Tests for VVT tenant isolation (Phase 1: Multi-Tenancy Fix).

Verifies that:
- tenant_utils correctly validates and resolves tenant IDs
- VVT routes filter data by tenant_id
- One tenant cannot see another tenant's data
- "default" tenant_id is rejected
"""

import pytest
import uuid
from unittest.mock import MagicMock, AsyncMock, patch
from datetime import datetime

from fastapi import HTTPException
from fastapi.testclient import TestClient

from compliance.api.tenant_utils import get_tenant_id, _validate_tenant_id


# =============================================================================
# tenant_utils unit tests
# =============================================================================

TENANT_A = "9282a473-5c95-4b3a-bf78-0ecc0ec71d3e"
TENANT_B = "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee"


class TestValidateTenantId:
    def test_valid_uuid(self):
        assert _validate_tenant_id(TENANT_A) == TENANT_A

    def test_valid_uuid_uppercase(self):
        upper = TENANT_A.upper()
        assert _validate_tenant_id(upper) == upper

    def test_reject_default_string(self):
        with pytest.raises(HTTPException) as exc_info:
            _validate_tenant_id("default")
        assert exc_info.value.status_code == 400
        assert "default" in str(exc_info.value.detail)

    def test_reject_empty_string(self):
        with pytest.raises(HTTPException) as exc_info:
            _validate_tenant_id("")
        assert exc_info.value.status_code == 400

    def test_reject_random_string(self):
        with pytest.raises(HTTPException) as exc_info:
            _validate_tenant_id("my-tenant")
        assert exc_info.value.status_code == 400

    def test_reject_partial_uuid(self):
        with pytest.raises(HTTPException) as exc_info:
            _validate_tenant_id("9282a473-5c95-4b3a")
        assert exc_info.value.status_code == 400


class TestGetTenantId:
    @pytest.mark.asyncio
    async def test_header_takes_precedence(self):
        result = await get_tenant_id(x_tenant_id=TENANT_A, tenant_id=TENANT_B)
        assert result == TENANT_A

    @pytest.mark.asyncio
    async def test_query_param_fallback(self):
        result = await get_tenant_id(x_tenant_id=None, tenant_id=TENANT_B)
        assert result == TENANT_B

    @pytest.mark.asyncio
    async def test_env_fallback(self):
        result = await get_tenant_id(x_tenant_id=None, tenant_id=None)
        # Falls back to the ENV default, which is the well-known dev UUID
        assert result == TENANT_A

    @pytest.mark.asyncio
    async def test_reject_default_via_header(self):
        with pytest.raises(HTTPException):
            await get_tenant_id(x_tenant_id="default", tenant_id=None)


# =============================================================================
# VVT Model tests — tenant_id column present
# =============================================================================

class TestVVTModelsHaveTenantId:
    def test_organization_has_tenant_id(self):
        from compliance.db.vvt_models import VVTOrganizationDB
        assert hasattr(VVTOrganizationDB, 'tenant_id')
        col = VVTOrganizationDB.__table__.columns['tenant_id']
        assert col.nullable is False

    def test_activity_has_tenant_id(self):
        from compliance.db.vvt_models import VVTActivityDB
        assert hasattr(VVTActivityDB, 'tenant_id')
        col = VVTActivityDB.__table__.columns['tenant_id']
        assert col.nullable is False

    def test_audit_log_has_tenant_id(self):
        from compliance.db.vvt_models import VVTAuditLogDB
        assert hasattr(VVTAuditLogDB, 'tenant_id')
        col = VVTAuditLogDB.__table__.columns['tenant_id']
        assert col.nullable is False

    def test_activity_no_global_unique_vvt_id(self):
        """vvt_id should NOT have a global unique constraint anymore."""
        from compliance.db.vvt_models import VVTActivityDB
        col = VVTActivityDB.__table__.columns['vvt_id']
        assert col.unique is not True  # unique moved to composite constraint


# =============================================================================
# VVT Route integration tests — tenant isolation via mocked DB
# =============================================================================

def _make_activity(tenant_id, vvt_id="VVT-001", name="Test", **kwargs):
    """Create a mock VVTActivityDB."""
    act = MagicMock()
    act.id = uuid.uuid4()
    act.tenant_id = tenant_id
    act.vvt_id = vvt_id
    act.name = name
    act.description = kwargs.get("description", "")
    act.purposes = kwargs.get("purposes", [])
    act.legal_bases = kwargs.get("legal_bases", [])
    act.data_subject_categories = []
    act.personal_data_categories = []
    act.recipient_categories = []
    act.third_country_transfers = []
    act.retention_period = {}
    act.tom_description = None
    act.business_function = kwargs.get("business_function", "IT")
    act.systems = []
    act.deployment_model = None
    act.data_sources = []
    act.data_flows = []
    act.protection_level = "MEDIUM"
    act.dpia_required = False
    act.structured_toms = {}
    act.status = kwargs.get("status", "DRAFT")
    act.responsible = None
    act.owner = None
    act.last_reviewed_at = None
    act.next_review_at = None
    act.created_by = "system"
    act.dsfa_id = None
    act.created_at = datetime.utcnow()
    act.updated_at = datetime.utcnow()
    return act


class TestVVTRouteTenantIsolation:
    """Verify that _activity_to_response and _log_audit accept tenant_id."""

    def test_activity_to_response(self):
        from compliance.api.vvt_routes import _activity_to_response
        act = _make_activity(TENANT_A, "VVT-100", "Test Activity")
        resp = _activity_to_response(act)
        assert resp.vvt_id == "VVT-100"
        assert resp.name == "Test Activity"

    def test_log_audit_with_tenant(self):
        from compliance.api.vvt_routes import _log_audit
        db = MagicMock()
        _log_audit(db, tenant_id=TENANT_A, action="CREATE", entity_type="activity")
        db.add.assert_called_once()
        entry = db.add.call_args[0][0]
        assert entry.tenant_id == TENANT_A
        assert entry.action == "CREATE"

    def test_log_audit_different_tenants(self):
        from compliance.api.vvt_routes import _log_audit
        db = MagicMock()
        _log_audit(db, tenant_id=TENANT_A, action="CREATE", entity_type="activity")
        _log_audit(db, tenant_id=TENANT_B, action="UPDATE", entity_type="activity")
        assert db.add.call_count == 2
        entries = [call[0][0] for call in db.add.call_args_list]
        assert entries[0].tenant_id == TENANT_A
        assert entries[1].tenant_id == TENANT_B


# =============================================================================
# DSFA / Vendor — DEFAULT_TENANT_ID no longer "default"
# =============================================================================

class TestDSFADefaultTenantFixed:
    def test_dsfa_default_is_uuid(self):
        from compliance.api.dsfa_routes import DEFAULT_TENANT_ID
        assert DEFAULT_TENANT_ID != "default"
        assert len(DEFAULT_TENANT_ID) == 36
        assert "-" in DEFAULT_TENANT_ID

    def test_dsfa_get_tenant_id_fallback(self):
        from compliance.api.dsfa_routes import _get_tenant_id
        result = _get_tenant_id(None)
        assert result != "default"
        assert len(result) == 36


class TestVendorDefaultTenantFixed:
    def test_vendor_default_is_uuid(self):
        from compliance.api.vendor_compliance_routes import DEFAULT_TENANT_ID
        assert DEFAULT_TENANT_ID != "default"
        assert len(DEFAULT_TENANT_ID) == 36
        assert "-" in DEFAULT_TENANT_ID
125 docs-src/services/sdk-modules/change-requests.md Normal file
@@ -0,0 +1,125 @@
# Change-Request System (CP-CR)

The change-request system is the central inbox for all document changes in the Compliance SDK. Instead of creating or modifying documents directly, every change is proposed as a change request and must be explicitly accepted or rejected.

## Architecture

```mermaid
graph TD
    A[Trigger event] --> B[Change-request engine]
    B --> C[CR inbox: pending]
    C --> D{Decision}
    D -->|Accept| E[Create version]
    D -->|Edit & Accept| F[Edit + version]
    D -->|Reject| G[Reject + reason]
    E --> H[Audit log]
    F --> H
    G --> H
```

## API Endpoints

| Method | Path | Description |
|--------|------|-------------|
| `GET` | `/change-requests` | List (filters: status, doc_type, priority) |
| `GET` | `/change-requests/stats` | Statistics: pending, critical, accepted, rejected |
| `GET` | `/change-requests/{id}` | Detail with audit log |
| `POST` | `/change-requests` | Create manually |
| `POST` | `/change-requests/{id}/accept` | Apply the change, create a version |
| `POST` | `/change-requests/{id}/reject` | Reject with a reason |
| `POST` | `/change-requests/{id}/edit` | Edit the proposal and accept it |
| `DELETE` | `/change-requests/{id}` | Soft delete |

## Status Workflow

```
pending --> accepted
pending --> rejected
pending --> edited_and_accepted
```

## Priorities

- `critical`: immediate action required
- `high`: important, handle promptly
- `normal`: standard priority
- `low`: can wait

## Trigger Types

| Trigger | Generated change requests |
|---------|---------------------------|
| `generation` | Generated automatically from master data |
| `vvt_dpia` | VVT activity with dpia_required=true → DSFA CR |
| `vvt_data_category` | New data category → Loeschfristen (retention) CR |
| `use_case_risk` | High-risk use case → DSFA CR |
| `use_case_ai` | Use case involving AI → DSFA section-update CR |
| `manual` | Created manually |

## Change-Request Engine

The engine (`change_request_engine.py`) automatically generates change requests on specific trigger events:

### VVT Triggers
```python
generate_change_requests_for_vvt(db, tenant_id, activity, user)
```
- If `dpia_required=true`: a DSFA CR is created
- For every new data category: a Loeschfristen CR is created

### Use-Case Triggers
```python
generate_change_requests_for_use_case(db, tenant_id, use_case, user)
```
- For `risk_level` high/critical: DSFA CR
- If AI is involved: DSFA section-update CR
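The VVT rules, for instance, can be sketched as a pure function over the activity payload. This is a hedged illustration, not the actual engine code: the function name, parameters, and proposal titles below are assumptions; the dict fields mirror the Migration 038 columns.

```python
# Illustrative sketch of the VVT trigger rules -- NOT the real
# change_request_engine.py. Field names mirror the Migration 038 columns;
# titles and the function signature are hypothetical.
def vvt_change_requests(activity: dict, existing_categories: set) -> list[dict]:
    crs = []
    if activity.get("dpia_required"):  # VVT activity flagged for a DPIA
        crs.append({
            "trigger_type": "vvt_dpia",
            "target_document_type": "dsfa",
            "proposal_title": f"Create DSFA for '{activity['name']}'",
            "priority": "high",
        })
    for category in activity.get("personal_data_categories", []):
        if category not in existing_categories:  # category not covered yet
            crs.append({
                "trigger_type": "vvt_data_category",
                "target_document_type": "loeschfristen",
                "proposal_title": f"Define retention period for '{category}'",
                "priority": "normal",
            })
    return crs
```

The real engine additionally persists each CR with `tenant_id` and writes an audit entry.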

## Frontend

The change-request inbox is available at `/sdk/change-requests`:
- Stats bar with counts (pending, critical, accepted, rejected)
- Filters by status and document type
- Card list with trigger badge, title, priority
- Accept/Edit/Reject modals
- 60 s auto-refresh

## Database

### Migration 038

```sql
CREATE TABLE compliance_change_requests (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    tenant_id VARCHAR(255) NOT NULL,
    trigger_type VARCHAR(50) NOT NULL,
    target_document_type VARCHAR(50),
    target_document_id UUID,
    target_section VARCHAR(100),
    proposal_title VARCHAR(500) NOT NULL,
    proposal_body TEXT,
    proposed_changes JSONB DEFAULT '{}',
    status VARCHAR(30) DEFAULT 'pending',
    priority VARCHAR(20) DEFAULT 'normal',
    decided_by VARCHAR(255),
    decided_at TIMESTAMPTZ,
    rejection_reason TEXT,
    created_by VARCHAR(255) DEFAULT 'system',
    created_at TIMESTAMPTZ DEFAULT NOW(),
    deleted_at TIMESTAMPTZ
);

CREATE TABLE compliance_change_request_audit (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    change_request_id UUID REFERENCES compliance_change_requests(id),
    action VARCHAR(50) NOT NULL,
    performed_by VARCHAR(255),
    before_state JSONB,
    after_state JSONB,
    created_at TIMESTAMPTZ DEFAULT NOW()
);
```

## Tests

- 26 tests in `test_change_request_routes.py`
- Schema validation, engine rules, route registration
82 docs-src/services/sdk-modules/dokumentengenerierung.md Normal file
@@ -0,0 +1,82 @@
# Document Generation from Master Data

Based on the company profile (master data), compliance documents can be generated automatically as drafts. Generation produces change requests, NOT documents directly: everything has to be reviewed through the CR inbox.

## Workflow

```mermaid
graph LR
    A[Master data] --> B[Template engine]
    B --> C[Draft documents]
    C --> D[Change requests]
    D --> E[CR inbox review]
    E --> F[Final document]
```

## API Endpoints

| Method | Path | Description |
|--------|------|-------------|
| `GET` | `/generation/preview/{doc_type}` | Preview without DB writes |
| `POST` | `/generation/apply/{doc_type}` | Generates drafts → creates CRs |

### Valid Document Types

`dsfa`, `vvt`, `tom`, `loeschfristen`, `obligation`

## Template Generators

### DSFA (`dsfa_template.py`)
- Creates a DSFA skeleton based on the company profile
- If AI systems are present: `risk_level = high`, AI sections pre-filled
- DPO name and supervisory authority inserted automatically

### VVT (`vvt_template.py`)
- One VVT entry per `processing_system`
- US cloud hosting → a third-country-transfer entry is added automatically
- Data categories and legal bases pre-filled

### TOM (`tom_template.py`)
- 8 base TOMs (GDPR standard)
- +3 with `subject_to_nis2` (cybersecurity)
- +3 with `subject_to_ai_act` (AI compliance)
- +3 with `subject_to_iso27001` (ISMS)
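The flag-driven counting above can be sketched as follows. The category labels for the extra measures match the expectations in `test_generation_routes.py`; the eight base category names are placeholders, not the real `tom_template.py` content.

```python
# Hedged sketch of the flag-driven TOM draft logic -- not the real
# tom_template.py. The 8 base category names below are placeholders.
BASE_CATEGORIES = [
    "Zutrittskontrolle", "Zugangskontrolle", "Zugriffskontrolle",
    "Weitergabekontrolle", "Eingabekontrolle", "Auftragskontrolle",
    "Verfuegbarkeitskontrolle", "Trennungskontrolle",
]

def generate_tom_drafts(ctx: dict) -> list[dict]:
    drafts = [{"category": c} for c in BASE_CATEGORIES]
    if ctx.get("subject_to_nis2"):
        drafts += [{"category": "Cybersicherheit"}] * 3
    if ctx.get("subject_to_ai_act"):
        drafts += [{"category": "KI-Compliance"}] * 3
    if ctx.get("subject_to_iso27001"):
        drafts += [{"category": "ISMS"}] * 3
    return drafts
```

With all three flags set, this yields 8 + 3 + 3 + 3 = 17 drafts, matching the test suite.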

### Loeschfristen (`loeschfristen_template.py`)
- One retention period per data category found in `processing_systems`
- 10 standard periods (e.g. Bankdaten → 10 years per HGB)
- Unknown categories → "Noch festzulegen" (still to be defined)
- Deduplicated across systems that share a category
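The deduplication and default behavior can be sketched like this. The signature and field names follow the assertions in `test_generation_routes.py`; the period table below is illustrative, not the full set of 10 standard periods.

```python
# Hedged sketch of loeschfristen_template.py's behavior; only one of the
# 10 standard periods is shown, the rest of the table is illustrative.
STANDARD_PERIODS = {
    "Bankdaten": ("10 Jahre (§ 257 HGB)", "§ 257 HGB, § 147 AO"),
}

def generate_loeschfristen_drafts(ctx: dict) -> list[dict]:
    drafts, seen = [], set()
    for system in ctx.get("processing_systems", []):
        for category in system.get("personal_data_categories", []):
            if category in seen:
                continue  # same category in several systems -> one draft
            seen.add(category)
            period, legal_basis = STANDARD_PERIODS.get(
                category, ("Noch festzulegen", "Noch festzulegen")
            )
            drafts.append({
                "data_category": category,
                "retention_period": period,
                "legal_basis": legal_basis,
            })
    return drafts
```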

### Obligations (`obligation_template.py`)
- 8 base GDPR obligations
- +3 with AI Act
- +2 with NIS2

## Master-Data Context

The template context is read from `compliance_company_profiles` and contains:

| Field | Description |
|-------|-------------|
| `company_name` | Company name |
| `dpo_name`, `dpo_email` | Data protection officer |
| `supervisory_authority` | Supervisory authority |
| `processing_systems` | IT systems handling personal data |
| `ai_systems` | AI system catalogue |
| `subject_to_nis2/ai_act/iso27001` | Regulation flags |
| `review_cycle_months` | Review cycle |

## Frontend

Once the company-profile wizard is complete (`is_complete = true`), a "Generate documents" CTA panel appears:
- One "Generate" button per document type
- Result: number of change requests created
- Link to the CR inbox

## Tests

- 21 tests in `test_generation_routes.py`
- All 5 template generators with varying context inputs
- Regulation-flag combinations
- Route registration
60 docs-src/services/sdk-modules/multi-tenancy.md Normal file
@@ -0,0 +1,60 @@
# Multi-Tenancy

The Compliance SDK is fully multi-tenant. Each tenant's data is isolated: no tenant can access another tenant's data.

## Tenant ID

- Format: UUID v4 (e.g. `9282a473-5c95-4b3a-bf78-0ecc0ec71d3e`)
- The string `"default"` is NO longer accepted
- Validated via regex in `tenant_utils.py`

## Shared Tenant Middleware

`compliance/api/tenant_utils.py` provides a central FastAPI dependency:

```python
from compliance.api.tenant_utils import get_tenant_id

@router.get("/example")
async def example(tid: str = Depends(get_tenant_id)):
    # tid is a validated UUID
    ...
```

### Resolution Order

1. `X-Tenant-ID` HTTP header
2. `tenant_id` query parameter
3. `DEFAULT_TENANT_ID` environment variable
4. Fallback: `9282a473-5c95-4b3a-bf78-0ecc0ec71d3e`
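The resolution order can be sketched without FastAPI. This is a hedged stand-in for the real dependency in `tenant_utils.py`, which raises `HTTPException(400)` rather than `ValueError`; the function names here are hypothetical.

```python
import os
import re
from typing import Optional

# Hedged sketch of the tenant resolution above -- not the real tenant_utils.py.
UUID_RE = re.compile(
    r"^[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-"
    r"[0-9a-fA-F]{4}-[0-9a-fA-F]{12}$"
)
FALLBACK_TENANT_ID = "9282a473-5c95-4b3a-bf78-0ecc0ec71d3e"

def validate_tenant_id(value: str) -> str:
    # Rejects "", "default", partial UUIDs, and any other non-UUID string
    if not value or not UUID_RE.match(value):
        raise ValueError(f"tenant_id must be a UUID, got {value!r}")
    return value

def resolve_tenant_id(header: Optional[str], query: Optional[str]) -> str:
    # Header > query parameter > environment variable > well-known dev UUID
    candidate = header or query or os.getenv("DEFAULT_TENANT_ID") or FALLBACK_TENANT_ID
    return validate_tenant_id(candidate)
```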

## Migration 035: VVT Tenant Isolation

- `compliance_vvt_activities`, `compliance_vvt_organization`, `compliance_vvt_audit_log`: added `tenant_id VARCHAR(255) NOT NULL`
- Existing rows backfilled with the standard UUID
- UNIQUE constraint on `(tenant_id, vvt_id)` instead of a global `vvt_id`
- DSFA/Vendor: `"default"` migrated to the UUID

## Affected Modules

| Module | Change |
|--------|--------|
| VVT | `tenant_id` column + query filter on all endpoints |
| DSFA | `DEFAULT_TENANT_ID` changed from `"default"` to a UUID |
| Vendor Compliance | `DEFAULT_TENANT_ID` changed from `"default"` to a UUID |
| Change Requests | Built with `tenant_id` natively |
| Versioning | Built with `tenant_id` natively |
| Company Profile | Built with `tenant_id` natively |

## Frontend Proxy

The Next.js proxy sets the `X-Tenant-ID` header automatically:

```typescript
headers['X-Tenant-ID'] = clientTenantId || process.env.DEFAULT_TENANT_ID || '9282a473-...'
```

## Tests

- 20 tests in `test_vvt_tenant_isolation.py`
- UUID validation, header/query precedence, model column checks, route isolation
90 docs-src/services/sdk-modules/stammdaten.md Normal file
@@ -0,0 +1,90 @@
# Stammdaten / Company Profile

The company profile (master data) is the foundation for automatic document generation. It is captured in a multi-step wizard.

## Wizard Steps

| Step | Name | Fields |
|------|------|--------|
| 1 | Basics | Company name, legal form, industry, founding year |
| 2 | Business model | B2B/B2C/B2B2C, offerings |
| 3 | Company size | Size class, headcount, revenue |
| 4 | Locations | Headquarters, target markets |
| 5 | Data protection | Privacy role, AI usage, DPO |
| 6 | Systems & AI | IT systems (personal data), AI system catalogue |
| 7 | Legal framework | NIS2/AI Act/ISO 27001, supervisory authority, review cycle, contacts |
| 8 | Product & machinery | Machine builders only: CE, safety, software |

## API Endpoints

| Method | Path | Description |
|--------|------|-------------|
| `GET` | `/company-profile` | Load profile |
| `POST` | `/company-profile` | Create/update profile (upsert) |
| `DELETE` | `/company-profile` | Delete profile (GDPR Art. 17) |
| `GET` | `/company-profile/template-context` | Flat dict for template substitution |

## Extended Fields (Phase 2)

### Migration 036

| Column | Type | Description |
|--------|------|-------------|
| `repos` | JSONB | Git repos in scope |
| `document_sources` | JSONB | Existing compliance documents |
| `processing_systems` | JSONB | IT systems handling personal data |
| `ai_systems` | JSONB | Structured AI system catalogue |
| `technical_contacts` | JSONB | CISO, IT manager, etc. |
| `subject_to_nis2` | BOOLEAN | Subject to NIS2 |
| `subject_to_ai_act` | BOOLEAN | AI Act relevant |
| `subject_to_iso27001` | BOOLEAN | ISO 27001 certification |
| `supervisory_authority` | VARCHAR(255) | Competent supervisory authority |
| `review_cycle_months` | INTEGER | Review cycle in months |

### Processing Systems Format

```json
[
  {
    "name": "SAP HR",
    "vendor": "SAP",
    "hosting": "cloud",
    "personal_data_categories": ["Mitarbeiter", "Bankdaten"]
  }
]
```

### AI Systems Format

```json
[
  {
    "name": "Chatbot",
    "purpose": "Kundensupport",
    "risk_category": "limited",
    "vendor": "OpenAI",
    "has_human_oversight": true
  }
]
```

## Template Context

The endpoint `GET /company-profile/template-context` returns a flat dict that can feed directly into template generation:

```json
{
  "company_name": "Acme GmbH",
  "processing_systems": [...],
  "ai_systems": [...],
  "subject_to_nis2": false,
  "subject_to_ai_act": true,
  "has_ai_systems": true,
  "is_complete": true
}
```
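The flattening behind this endpoint can be sketched as below. This is a hedged illustration, not the real route handler; the function name is hypothetical, and the fields are the ones shown in the JSON example, with derived flags such as `has_ai_systems` computed rather than stored.

```python
# Hedged sketch of the template-context flattening; the real endpoint reads
# the row from compliance_company_profiles.
def build_template_context(profile: dict) -> dict:
    ctx = {
        "company_name": profile.get("company_name"),
        "processing_systems": profile.get("processing_systems") or [],
        "ai_systems": profile.get("ai_systems") or [],
        "subject_to_nis2": bool(profile.get("subject_to_nis2")),
        "subject_to_ai_act": bool(profile.get("subject_to_ai_act")),
        "subject_to_iso27001": bool(profile.get("subject_to_iso27001")),
        "supervisory_authority": profile.get("supervisory_authority"),
    }
    # Derived flag: true whenever the AI system catalogue is non-empty
    ctx["has_ai_systems"] = len(ctx["ai_systems"]) > 0
    return ctx
```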

## Tests

- 10 tests in `test_company_profile_extend.py`
- Pydantic schema validation, row-to-response mapping, template context
80 docs-src/services/sdk-modules/versionierung.md Normal file
@@ -0,0 +1,80 @@
# Document Versioning

All compliance documents (DSFA, VVT, TOM, Loeschfristen, obligations) are versioned automatically. Every change creates a new version with a snapshot, a change summary, and the list of changed sections.

## Architecture

The versioning pattern follows the existing `compliance_legal_document_versions` approach:
- A separate versions table per document type
- Snapshot stored as JSONB (the complete document state)
- Status workflow: draft → approved
- A `current_version` column on the main table

## Supported Document Types

| Type | Versions table | FK column |
|------|----------------|-----------|
| DSFA | `compliance_dsfa_versions` | `dsfa_id` |
| VVT | `compliance_vvt_activity_versions` | `activity_id` |
| TOM | `compliance_tom_versions` | `measure_id` |
| Loeschfristen | `compliance_loeschfristen_versions` | `policy_id` |
| Obligations | `compliance_obligation_versions` | `obligation_id` |

## API Endpoints

Each document type exposes two versioning endpoints:

| Method | Path | Description |
|--------|------|-------------|
| `GET` | `/{id}/versions` | All versions (newest first) |
| `GET` | `/{id}/versions/{v}` | Specific version with snapshot |

### Examples

```
GET /api/compliance/dsfa/{dsfa_id}/versions
GET /api/compliance/dsfa/{dsfa_id}/versions/3
GET /api/compliance/activities/{id}/versions
GET /api/compliance/measures/{id}/versions
GET /api/compliance/loeschfristen/{id}/versions
GET /api/compliance/obligations/{id}/versions
```

## Shared Helper

`versioning_utils.py` provides three functions:

```python
create_version_snapshot(db, doc_type, doc_id, tenant_id, snapshot, change_summary, changed_sections, created_by)
list_versions(db, doc_type, doc_id, tenant_id)
get_version(db, doc_type, doc_id, tenant_id, version_number)
```
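The snapshot pattern behind these helpers can be illustrated with a minimal in-memory store. This is a hedged sketch, not the real `versioning_utils.py` (which writes to the per-type SQL tables and scopes everything by `tenant_id`); the class and its simplified signatures are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class VersionStore:
    """Illustrative in-memory model of the snapshot pattern (hypothetical)."""
    _versions: dict = field(default_factory=dict)  # (doc_type, doc_id) -> list

    def create_version_snapshot(self, doc_type, doc_id, snapshot, change_summary):
        history = self._versions.setdefault((doc_type, doc_id), [])
        entry = {
            "version_number": len(history) + 1,  # monotonically increasing
            "snapshot": dict(snapshot),          # full document state (JSONB-like)
            "change_summary": change_summary,
            "status": "draft",                   # draft -> approved workflow
        }
        history.append(entry)
        return entry

    def list_versions(self, doc_type, doc_id):
        # Newest first, matching GET /{id}/versions
        return list(reversed(self._versions.get((doc_type, doc_id), [])))

    def get_version(self, doc_type, doc_id, version_number):
        for v in self._versions.get((doc_type, doc_id), []):
            if v["version_number"] == version_number:
                return v
        return None
```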

## Frontend Component

The `VersionHistory` React component renders a timeline of all versions:

```tsx
<VersionHistory
  documentType="dsfa"
  documentId={dsfa.id}
  apiPath={`dsfa/${dsfa.id}/versions`}
/>
```

Features:
- Expandable timeline with version dots
- Status badges (approved/draft)
- Date and changed sections
- Change summary

## Database

### Migration 037

Creates the 5 versions tables and adds `current_version INTEGER DEFAULT 0` to all source tables.

## Tests

- 20 tests in `test_document_versions.py`
- VERSION_TABLES mapping, create/list/get version, route registration
@@ -93,6 +93,11 @@ nav:
 - Obligations v2 (CP-OBL): services/sdk-modules/obligations.md
 - Training Engine (CP-TRAIN): services/sdk-modules/training.md
 - SDK Workflow & Seq-Nummern: services/sdk-modules/sdk-workflow.md
+- Multi-Tenancy: services/sdk-modules/multi-tenancy.md
+- Stammdaten / Company Profile: services/sdk-modules/stammdaten.md
+- Dokument-Versionierung: services/sdk-modules/versionierung.md
+- Change-Request System (CP-CR): services/sdk-modules/change-requests.md
+- Dokumentengenerierung: services/sdk-modules/dokumentengenerierung.md
 - Strategie:
 - Wettbewerbsanalyse & Roadmap: strategy/wettbewerbsanalyse.md
 - Entwicklung: