Compare commits

...

4 Commits

Author SHA1 Message Date
Benjamin Admin
b4d2be83eb Merge gitea/main: resolve ci.yaml conflict, keep Coolify deploy
All checks were successful
CI/CD / go-lint (push) Has been skipped
CI/CD / python-lint (push) Has been skipped
CI/CD / nodejs-lint (push) Has been skipped
CI/CD / test-go-ai-compliance (push) Successful in 40s
CI/CD / test-python-backend-compliance (push) Successful in 39s
CI/CD / test-python-document-crawler (push) Successful in 30s
CI/CD / test-python-dsms-gateway (push) Successful in 24s
CI/CD / validate-canonical-controls (push) Successful in 15s
CI/CD / Deploy (push) Successful in 3s
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-13 13:26:17 +01:00
Benjamin Admin
38c7cf0a00 Merge branch 'main' of ssh://gitea.meghsakha.com:22222/Benjamin_Boenisch/breakpilot-compliance 2026-03-13 13:23:30 +01:00
Benjamin Admin
399fa62267 docs: update all docs to reflect Coolify deployment model
Replace Hetzner references with Coolify. Deployment is now:
- Core + Compliance: Push gitea → Coolify auto-deploys
- Lehrer: stays local on Mac Mini

Updated: CLAUDE.md, MkDocs CI/CD pipeline, MkDocs index.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-13 12:09:51 +01:00
f1710fdb9e fix: migrate deployment from Hetzner to Coolify (#1)
All checks were successful
CI/CD / go-lint (push) Has been skipped
CI/CD / python-lint (push) Has been skipped
CI/CD / nodejs-lint (push) Has been skipped
CI/CD / test-go-ai-compliance (push) Successful in 34s
CI/CD / test-python-backend-compliance (push) Successful in 39s
CI/CD / test-python-document-crawler (push) Successful in 24s
CI/CD / test-python-dsms-gateway (push) Successful in 19s
CI/CD / validate-canonical-controls (push) Successful in 13s
CI/CD / Deploy (push) Successful in 2s
## Summary
- Add Coolify deployment configuration (docker-compose, healthchecks, network setup)
- Replace deploy-hetzner CI job with Coolify webhook deploy
- Externalize postgres, qdrant, S3 for Coolify environment

## All changes since branch creation
- Coolify docker-compose with Traefik labels and healthchecks
- CI pipeline: deploy-hetzner → deploy-coolify (simple webhook curl)
- SQLAlchemy 2.x text() compatibility fixes
- Alpine-compatible Dockerfile fixes

Co-authored-by: Sharang Parnerkar <parnerkarsharang@gmail.com>
Reviewed-on: #1
2026-03-13 10:45:35 +00:00
16 changed files with 543 additions and 664 deletions


@@ -2,53 +2,49 @@
 ## Entwicklungsumgebung (WICHTIG - IMMER ZUERST LESEN)
-### Zwei-Rechner-Setup + Hetzner
+### Zwei-Rechner-Setup + Coolify
 | Geraet | Rolle | Aufgaben |
 |--------|-------|----------|
 | **MacBook** | Entwicklung | Claude Terminal, Code-Entwicklung, Browser (Frontend-Tests) |
-| **Mac Mini** | Lokaler Server | Docker fuer lokale Dev/Tests (NICHT mehr fuer Production!) |
+| **Mac Mini** | Lokaler Server | Docker fuer lokale Dev/Tests (NICHT fuer Production!) |
-| **Hetzner** | Production | CI/CD Build + Deploy via Gitea Actions |
+| **Coolify** | Production | Automatisches Build + Deploy bei Push auf gitea |
-**WICHTIG:** Code wird auf dem MacBook bearbeitet. Production-Deployment laeuft automatisch auf Hetzner via CI/CD.
+**WICHTIG:** Code wird auf dem MacBook bearbeitet. Production-Deployment laeuft automatisch ueber Coolify.
-### Entwicklungsworkflow (CI/CD — seit 2026-03-11)
+### Entwicklungsworkflow (CI/CD — Coolify)
 ```bash
 # 1. Code auf MacBook bearbeiten (dieses Verzeichnis)
 # 2. Committen und zu BEIDEN Remotes pushen:
 git push origin main && git push gitea main
-# 3. FERTIG! Gitea Actions auf Hetzner uebernimmt automatisch:
-#    Push auf main → Lint → Tests → Build → Deploy
-#    Pipeline: .gitea/workflows/ci.yaml
+# 3. FERTIG! Push auf gitea triggert automatisch:
+#    - Gitea Actions: Lint → Tests → Validierung
+#    - Coolify: Build → Deploy
 # Dauer: ca. 3 Minuten
 # Status pruefen: https://gitea.meghsakha.com/Benjamin_Boenisch/breakpilot-compliance/actions
 ```
-**NICHT MEHR NOETIG:** Manuelles `ssh macmini "docker compose build"` — das macht jetzt die CI/CD Pipeline!
+**NICHT MEHR NOETIG:** Manuelles `ssh macmini "docker compose build"` fuer Production.
+**NIEMALS** manuell in Coolify auf "Redeploy" klicken — Gitea Actions triggert Coolify automatisch.
-### CI/CD Pipeline (Gitea Actions → Hetzner)
+### CI/CD Pipeline (Gitea Actions → Coolify)
 ```
-Push auf main → go-lint/python-lint/nodejs-lint (nur PRs)
+Push auf gitea main → go-lint/python-lint/nodejs-lint (nur PRs)
              → test-go-ai-compliance
              → test-python-backend-compliance
              → test-python-document-crawler
              → test-python-dsms-gateway
-             → deploy-hetzner (nur wenn ALLE Tests gruen)
+             → validate-canonical-controls
+             → Coolify: Build + Deploy (automatisch bei Push)
 ```
 **Dateien:**
-- `.gitea/workflows/ci.yaml` — Pipeline-Definition
+- `.gitea/workflows/ci.yaml` — Pipeline-Definition (Tests + Validierung)
-- `docker-compose.hetzner.yml` — Override: arm64→amd64 fuer Hetzner (x86_64)
-- Deploy-Pfad auf Hetzner: `/opt/breakpilot-compliance/`
-**Ablauf deploy-hetzner:**
-1. `git pull` im Deploy-Dir
-2. `docker compose -f docker-compose.yml -f docker-compose.hetzner.yml build --parallel`
-3. `docker compose up -d --remove-orphans`
-4. Health Checks
+- `docker-compose.yml` — Haupt-Compose
+- `docker-compose.hetzner.yml` — Override: arm64→amd64 fuer Coolify Production (x86_64)
 ### Lokale Entwicklung (Mac Mini — optional)
@@ -76,20 +72,18 @@ rsync -avz --exclude node_modules --exclude .next --exclude .git \
 - RAG-Service (Vektorsuche fuer Compliance-Dokumente)
 - Nginx (Reverse Proxy)
-**Externe Services (Hetzner/meghshakka) — seit 2026-03-06:**
+**Externe Services (Production):**
-- PostgreSQL 17 @ `46.225.100.82:54321` (sslmode=require) — Schemas: `compliance` (51), `public` (compliance_* + training_* + ucca_* + academy_*)
+- PostgreSQL 17 (sslmode=require) — Schemas: `compliance`, `public`
 - Qdrant @ `qdrant-dev.breakpilot.ai` (HTTPS, API-Key)
-- Object Storage @ `nbg1.your-objectstorage.com` (S3-kompatibel, TLS)
+- Object Storage (S3-kompatibel, TLS)
-Config via `.env` auf Mac Mini (nicht im Repo): `COMPLIANCE_DATABASE_URL`, `QDRANT_URL`, `QDRANT_API_KEY`
+Config via `.env` (nicht im Repo): `COMPLIANCE_DATABASE_URL`, `QDRANT_URL`, `QDRANT_API_KEY`
-Pruefen: `curl -sf http://macmini:8099/health`
 ---
 ## Haupt-URLs
-### Production (Hetzner — primaer)
+### Production (Coolify-deployed)
 | URL | Service | Beschreibung |
 |-----|---------|--------------|
@@ -145,18 +139,6 @@ Pruefen: `curl -sf http://macmini:8099/health`
 | docs | MkDocs/nginx | 8011 | bp-compliance-docs |
 | core-wait | curl health-check | - | bp-compliance-core-wait |
-### compliance-tts-service
-- Piper TTS + FFmpeg fuer Schulungsvideos
-- Speichert Audio/Video in Hetzner Object Storage (nbg1.your-objectstorage.com)
-- TTS-Modell: `de_DE-thorsten-high.onnx`
-- Dateien: `main.py`, `tts_engine.py`, `video_generator.py`, `storage.py`
-### document-crawler
-- Dokument-Analyse: PDF, DOCX, XLSX, PPTX
-- Gap-Analyse zwischen bestehenden Dokumenten und Compliance-Anforderungen
-- IPFS-Archivierung via dsms-gateway
-- Kommuniziert mit ai-compliance-sdk (LLM Gateway)
 ### Docker-Netzwerk
 Nutzt das externe Core-Netzwerk:
 ```yaml
@@ -202,8 +184,8 @@ breakpilot-compliance/
 ├── dsms-gateway/              # IPFS Gateway
 ├── scripts/                   # Helper Scripts
 ├── docker-compose.yml         # Compliance Compose (~10 Services, platform: arm64)
-├── docker-compose.hetzner.yml # Override: arm64→amd64 fuer Hetzner
+├── docker-compose.hetzner.yml # Override: arm64→amd64 fuer Coolify Production
-└── .gitea/workflows/ci.yaml   # CI/CD Pipeline (Lint → Tests → Deploy)
+└── .gitea/workflows/ci.yaml   # CI/CD Pipeline (Lint → Tests → Validierung)
 ```
 ---
@@ -213,7 +195,7 @@ breakpilot-compliance/
 ### Deployment (CI/CD — Standardweg)
 ```bash
-# Committen und pushen → CI/CD deployt automatisch auf Hetzner:
+# Committen und pushen → Coolify deployt automatisch:
 git push origin main && git push gitea main
 # CI-Status pruefen (im Browser):
@@ -326,10 +308,6 @@ DELETE /api/v1/projects/{project_id} → Projekt archivieren (Soft Delete)
 - `app/sdk/layout.tsx` — liest `?project=` aus searchParams
 - `app/api/sdk/v1/projects/` — Next.js Proxy zum Backend
-**Multi-Tab:** Tab A (Projekt X) und Tab B (Projekt Y) interferieren nicht — separate BroadcastChannel + localStorage Keys.
-**Stammdaten-Kopie:** Neues Projekt mit `copy_from_project_id` → Backend kopiert `companyProfile` aus dem Quell-State. Danach unabhaengig editierbar.
 ### Backend-Compliance APIs
 ```
 POST/GET /api/v1/compliance/risks
@@ -340,7 +318,7 @@ POST/GET /api/v1/dsr/requests
 POST/GET /api/v1/gdpr/exports
 POST/GET /api/v1/consent/admin
-# Stammdaten, Versionierung & Change-Requests (Phase 1-6, 2026-03-07)
+# Stammdaten, Versionierung & Change-Requests
 GET/POST/DELETE /api/compliance/company-profile
 GET /api/compliance/company-profile/template-context
 GET /api/compliance/change-requests
@@ -358,24 +336,6 @@ GET /api/compliance/{doc}/{id}/versions
 - UUID-Format, kein `"default"` mehr
 - Header `X-Tenant-ID` > Query `tenant_id` > ENV-Fallback
-### Migrations (035-038)
-| Nr | Datei | Beschreibung |
-|----|-------|--------------|
-| 035 | `migrations/035_vvt_tenant_isolation.sql` | VVT tenant_id + DSFA/Vendor default→UUID |
-| 036 | `migrations/036_company_profile_extend.sql` | Stammdaten JSONB + Regulierungs-Flags |
-| 037 | `migrations/037_document_versions.sql` | 5 Versions-Tabellen + current_version |
-| 038 | `migrations/038_change_requests.sql` | Change-Requests + Audit-Log |
-### Neue Backend-Module
-| Datei | Beschreibung |
-|-------|--------------|
-| `compliance/api/tenant_utils.py` | Shared Tenant-ID Dependency |
-| `compliance/api/versioning_utils.py` | Shared Versioning Helper |
-| `compliance/api/change_request_routes.py` | CR CRUD + Accept/Reject/Edit |
-| `compliance/api/change_request_engine.py` | Regelbasierte CR-Generierung |
-| `compliance/api/generation_routes.py` | Dokumentengenerierung aus Stammdaten |
-| `compliance/api/document_templates/` | 5 Template-Generatoren (DSFA, VVT, TOM, etc.) |
 ---
 ## Wichtige Dateien (Referenz)
@@ -383,9 +343,7 @@ GET /api/compliance/{doc}/{id}/versions
 | Datei | Beschreibung |
 |-------|--------------|
 | `admin-compliance/app/(sdk)/` | Alle 37+ SDK-Routes |
-| `admin-compliance/app/(sdk)/sdk/change-requests/page.tsx` | Change-Request Inbox |
-| `admin-compliance/components/sdk/Sidebar/SDKSidebar.tsx` | SDK Navigation (mit CR-Badge) |
-| `admin-compliance/components/sdk/VersionHistory.tsx` | Versions-Timeline-Komponente |
+| `admin-compliance/components/sdk/Sidebar/SDKSidebar.tsx` | SDK Navigation |
 | `admin-compliance/components/sdk/CommandBar.tsx` | Command Palette |
 | `admin-compliance/lib/sdk/context.tsx` | SDK State (Provider) |
 | `backend-compliance/compliance/` | Haupt-Package (50+ Dateien) |

.env.coolify.example (new file, 61 lines)

@@ -0,0 +1,61 @@
# =========================================================
# BreakPilot Compliance — Coolify Environment Variables
# =========================================================
# Copy these into Coolify's environment variable UI
# for the breakpilot-compliance Docker Compose resource.
# =========================================================
# --- External PostgreSQL (Coolify-managed, same as Core) ---
COMPLIANCE_DATABASE_URL=postgresql://breakpilot:CHANGE_ME@<coolify-postgres-hostname>:5432/breakpilot_db
# --- Security ---
JWT_SECRET=CHANGE_ME_SAME_AS_CORE
# --- External S3 Storage (same as Core) ---
S3_ENDPOINT=<s3-endpoint-host:port>
S3_ACCESS_KEY=CHANGE_ME_SAME_AS_CORE
S3_SECRET_KEY=CHANGE_ME_SAME_AS_CORE
S3_SECURE=true
# --- External Qdrant ---
QDRANT_URL=https://<qdrant-hostname>
QDRANT_API_KEY=CHANGE_ME_QDRANT_API_KEY
# --- Session ---
SESSION_TTL_HOURS=24
# --- SMTP (Real mail server) ---
SMTP_HOST=smtp.example.com
SMTP_PORT=587
SMTP_USERNAME=compliance@breakpilot.ai
SMTP_PASSWORD=CHANGE_ME_SMTP_PASSWORD
SMTP_FROM_NAME=BreakPilot Compliance
SMTP_FROM_ADDR=compliance@breakpilot.ai
# --- LLM Configuration ---
COMPLIANCE_LLM_PROVIDER=anthropic
SELF_HOSTED_LLM_URL=
SELF_HOSTED_LLM_MODEL=
COMPLIANCE_LLM_MAX_TOKENS=4096
COMPLIANCE_LLM_TEMPERATURE=0.3
COMPLIANCE_LLM_TIMEOUT=120
ANTHROPIC_API_KEY=CHANGE_ME_ANTHROPIC_KEY
ANTHROPIC_DEFAULT_MODEL=claude-sonnet-4-5-20250929
# --- Ollama (optional) ---
OLLAMA_URL=
OLLAMA_DEFAULT_MODEL=
COMPLIANCE_LLM_MODEL=
# --- LLM Fallback ---
LLM_FALLBACK_PROVIDER=
# --- PII & Audit ---
PII_REDACTION_ENABLED=true
PII_REDACTION_LEVEL=standard
AUDIT_RETENTION_DAYS=365
AUDIT_LOG_PROMPTS=true
# --- Frontend URLs (build args) ---
NEXT_PUBLIC_API_URL=https://api-compliance.breakpilot.ai
NEXT_PUBLIC_SDK_URL=https://sdk.breakpilot.ai
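Compose reads these values with POSIX-style `${VAR:-default}` substitution, which is why the compose file below can ship defaults while Coolify's UI overrides them. A minimal sketch of that fallback rule in plain shell (the staging URL is a made-up illustration, not a real endpoint):

```shell
#!/bin/sh
# Unset → the :-default branch applies (same rule docker compose uses).
unset NEXT_PUBLIC_API_URL
echo "${NEXT_PUBLIC_API_URL:-https://api-compliance.breakpilot.ai}"
# → https://api-compliance.breakpilot.ai

# Set → the explicit value wins over the default.
NEXT_PUBLIC_API_URL="https://staging.example"
echo "${NEXT_PUBLIC_API_URL:-https://api-compliance.breakpilot.ai}"
# → https://staging.example
```

Note that an empty-but-set variable also triggers the `:-` fallback; plain `${VAR-default}` would not.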


@@ -7,7 +7,7 @@
 #   Node.js: admin-compliance, developer-portal
 #
 # Workflow:
-#   Push auf main → Tests → Build → Deploy (Hetzner)
+#   Push auf main → Tests → Deploy (Coolify)
 #   Pull Request → Lint + Tests (kein Deploy)
 name: CI/CD
@@ -186,10 +186,11 @@ jobs:
         python scripts/validate-controls.py
   # ========================================
-  # Build & Deploy auf Hetzner (nur main, kein PR)
+  # Deploy via Coolify (nur main, kein PR)
   # ========================================
-  deploy-hetzner:
+  deploy-coolify:
+    name: Deploy
     runs-on: docker
     if: github.event_name == 'push' && github.ref == 'refs/heads/main'
     needs:
@@ -198,92 +199,11 @@ jobs:
       - test-python-document-crawler
       - test-python-dsms-gateway
       - validate-canonical-controls
-    container: docker:27-cli
+    container:
+      image: alpine:latest
     steps:
-      - name: Deploy
+      - name: Trigger Coolify deploy
         run: |
-          set -euo pipefail
-          DEPLOY_DIR="/opt/breakpilot-compliance"
-          COMPOSE_FILES="-f docker-compose.yml -f docker-compose.hetzner.yml"
-          COMMIT_SHA="${GITHUB_SHA:-unknown}"
-          SHORT_SHA="${COMMIT_SHA:0:8}"
-          REPO_URL="${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}.git"
-          echo "=== BreakPilot Compliance Deploy ==="
-          echo "Commit: ${SHORT_SHA}"
-          echo "Deploy Dir: ${DEPLOY_DIR}"
-          echo ""
-          # Der Runner laeuft in einem Container mit Docker-Socket-Zugriff,
-          # hat aber KEINEN direkten Zugriff auf das Host-Dateisystem.
-          # Loesung: Alpine-Helper-Container mit Host-Bind-Mount fuer Git-Ops.
-          # 1. Repo auf dem Host erstellen/aktualisieren via Helper-Container
-          echo "=== Updating code on host ==="
-          docker run --rm \
-            -v "${DEPLOY_DIR}:${DEPLOY_DIR}" \
-            --entrypoint sh \
-            alpine/git:latest \
-            -c "
-              if [ ! -d '${DEPLOY_DIR}/.git' ]; then
-                echo 'Erstmaliges Klonen nach ${DEPLOY_DIR}...'
-                git clone '${REPO_URL}' '${DEPLOY_DIR}'
-              else
-                cd '${DEPLOY_DIR}'
-                git fetch origin main
-                git reset --hard origin/main
-              fi
-            "
-          echo "Code aktualisiert auf ${SHORT_SHA}"
-          # 2. .env sicherstellen (muss einmalig manuell angelegt werden)
-          docker run --rm -v "${DEPLOY_DIR}:${DEPLOY_DIR}" alpine \
-            sh -c "
-              if [ ! -f '${DEPLOY_DIR}/.env' ]; then
-                echo 'WARNUNG: ${DEPLOY_DIR}/.env fehlt!'
-                echo 'Bitte einmalig auf dem Host anlegen.'
-                echo 'Deploy wird fortgesetzt (Services starten ggf. mit Defaults).'
-              else
-                echo '.env vorhanden'
-              fi
-            "
-          # 3. Build + Deploy via Helper-Container mit Docker-Socket + Deploy-Dir
-          #    docker compose muss die YAML-Dateien lesen koennen, daher
-          #    alles in einem Container mit beiden Mounts ausfuehren.
-          echo ""
-          echo "=== Building + Deploying ==="
-          docker run --rm \
-            -v /var/run/docker.sock:/var/run/docker.sock \
-            -v "${DEPLOY_DIR}:${DEPLOY_DIR}" \
-            -w "${DEPLOY_DIR}" \
-            docker:27-cli \
-            sh -c "
-              COMPOSE_FILES='-f docker-compose.yml -f docker-compose.hetzner.yml'
-              echo '=== Building Docker Images ==='
-              docker compose \${COMPOSE_FILES} build --parallel \
-                admin-compliance \
-                backend-compliance \
-                ai-compliance-sdk \
-                developer-portal
-              echo ''
-              echo '=== Starting containers ==='
-              docker compose \${COMPOSE_FILES} up -d --remove-orphans \
-                admin-compliance \
-                backend-compliance \
-                ai-compliance-sdk \
-                developer-portal
-              echo ''
-              echo '=== Health Checks ==='
-              sleep 10
-              for svc in bp-compliance-admin bp-compliance-backend bp-compliance-ai-sdk bp-compliance-developer-portal; do
-                STATUS=\$(docker inspect --format='{{.State.Status}}' \"\${svc}\" 2>/dev/null || echo 'not found')
-                echo \"\${svc}: \${STATUS}\"
-              done
-            "
-          echo ""
-          echo "=== Deploy abgeschlossen: ${SHORT_SHA} ==="
+          apk add --no-cache curl
+          curl -sf "${{ secrets.COOLIFY_WEBHOOK }}" \
+            -H "Authorization: Bearer ${{ secrets.COOLIFY_TOKEN }}"
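The replacement job reduces deployment to a single authenticated webhook call. A local dry-run sketch of that call, with the two Gitea Actions secrets stood in by ordinary environment variables (the URL and token below are placeholders, not real Coolify values):

```shell
#!/bin/sh
# Stand-ins for secrets.COOLIFY_WEBHOOK / secrets.COOLIFY_TOKEN.
COOLIFY_WEBHOOK="${COOLIFY_WEBHOOK:-https://coolify.example.com/api/v1/deploy?uuid=CHANGE_ME}"
COOLIFY_TOKEN="${COOLIFY_TOKEN:-CHANGE_ME}"

# -s: quiet, -f: exit non-zero on HTTP >= 400 so a failed webhook
# fails the CI step. Dry-run: print the command instead of calling it.
printf 'curl -sf %s -H "Authorization: Bearer %s"\n' \
  "$COOLIFY_WEBHOOK" "$COOLIFY_TOKEN"
```

The `-f` flag is what makes the one-liner safe in CI: without it, curl exits 0 even when Coolify rejects the token.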


@@ -5,8 +5,8 @@
 #
 # Phasen: gesetze, eu, templates, datenschutz, verbraucherschutz, verify, version, all
 #
-# Voraussetzung: RAG-Service und Qdrant muessen auf Hetzner laufen.
+# Voraussetzung: RAG-Service und Qdrant muessen auf Coolify laufen.
-#                Die BreakPilot-Services muessen deployed sein (ci.yaml deploy-hetzner).
+#                Die BreakPilot-Services muessen deployed sein (ci.yaml deploy-coolify).
 name: RAG Ingestion


@@ -37,8 +37,8 @@ WORKDIR /app
 ENV NODE_ENV=production
 # Create non-root user
-RUN addgroup --system --gid 1001 nodejs
-RUN adduser --system --uid 1001 nextjs
+RUN addgroup -S -g 1001 nodejs
+RUN adduser -S -u 1001 -G nodejs nextjs
 # Copy built assets
 COPY --from=builder /app/public ./public
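The `addgroup`/`adduser` change in this hunk matters because Alpine ships BusyBox implementations of both tools, which reject the GNU-style long options (`--system`, `--gid`, `--uid`) that Debian-based base images accept. A minimal sketch of the BusyBox-compatible pattern, using the same UID/GID and names as the diff (the image tag is an assumption for illustration):

```dockerfile
FROM alpine:3.21
# BusyBox adduser/addgroup take short flags only:
#   -S = system account, -g/-u = GID/UID, -G = primary group
RUN addgroup -S -g 1001 nodejs \
 && adduser -S -u 1001 -G nodejs nextjs
USER nextjs
```

With the long options, the build fails at this RUN step on Alpine; the short flags are portable across BusyBox versions.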


@@ -17,7 +17,7 @@ COPY . .
 RUN CGO_ENABLED=0 GOOS=linux go build -a -installsuffix cgo -o /ai-compliance-sdk ./cmd/server
 # Runtime stage
-FROM alpine:3.19
+FROM alpine:3.21
 WORKDIR /app


@@ -39,7 +39,7 @@ go build -o server ./cmd/server
 # Production: CI/CD (automatisch bei Push auf main)
 git push origin main && git push gitea main
-# → Gitea Actions: Tests → Build → Deploy auf Hetzner
+# → Gitea Actions: Tests → Build → Deploy auf Coolify
 # → Status: https://gitea.meghsakha.com/Benjamin_Boenisch/breakpilot-compliance/actions
 # Alternativ: mit Docker (lokal)
@@ -466,7 +466,7 @@ Tests laufen automatisch bei jedem Push via Gitea Actions (`.gitea/workflows/ci.
 | `test-python-document-crawler` | `python:3.12-slim` | `pytest tests/` |
 | `test-python-dsms-gateway` | `python:3.12-slim` | `pytest test_main.py` |
-Nach erfolgreichen Tests: automatisches Deploy auf Hetzner (`deploy-hetzner` Job).
+Nach erfolgreichen Tests: automatisches Deploy auf Coolify (`deploy-coolify` Job).
 ### Spezifische Tests


@@ -15,6 +15,7 @@ from typing import Any, Optional
 from fastapi import APIRouter, HTTPException, Header
 from pydantic import BaseModel
+from sqlalchemy import text
 from database import SessionLocal
@@ -75,13 +76,13 @@ async def get_compliance_scope(
     db = SessionLocal()
     try:
         row = db.execute(
-            """SELECT tenant_id,
+            text("""SELECT tenant_id,
                       state->'compliance_scope' AS scope,
                       created_at,
                       updated_at
                FROM sdk_states
                WHERE tenant_id = :tid
-                 AND state ? 'compliance_scope'""",
+                 AND state ? 'compliance_scope'"""),
             {"tid": tid},
         ).fetchone()
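The `text()` wrapper added throughout these hunks is the SQLAlchemy 2.x requirement that raw SQL be declared explicitly: passing a bare string to `execute()` raises `ObjectNotExecutableError` in 2.x, where 1.x silently accepted it. A minimal sketch against an in-memory SQLite engine (the real code runs `SessionLocal()` against PostgreSQL):

```python
from sqlalchemy import create_engine, text

engine = create_engine("sqlite://")  # in-memory DB, illustration only

with engine.connect() as conn:
    # 1.x accepted conn.execute("SELECT ..."); 2.x requires text(...)
    # and uses named bind parameters (:x) uniformly across drivers.
    row = conn.execute(text("SELECT :x + 1 AS y"), {"x": 1}).fetchone()
    print(row.y)  # → 2
```

The same rule applies to `Session.execute()`, which is why every raw statement in this file gained the wrapper while the bind-parameter dictionaries stayed unchanged.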
@@ -106,22 +107,22 @@ async def upsert_compliance_scope(
     db = SessionLocal()
     try:
         db.execute(
-            """INSERT INTO sdk_states (tenant_id, state)
+            text("""INSERT INTO sdk_states (tenant_id, state)
                VALUES (:tid, jsonb_build_object('compliance_scope', :scope::jsonb))
                ON CONFLICT (tenant_id) DO UPDATE
                SET state = sdk_states.state || jsonb_build_object('compliance_scope', :scope::jsonb),
-                   updated_at = NOW()""",
+                   updated_at = NOW()"""),
             {"tid": tid, "scope": scope_json},
         )
         db.commit()
         row = db.execute(
-            """SELECT tenant_id,
+            text("""SELECT tenant_id,
                       state->'compliance_scope' AS scope,
                       created_at,
                       updated_at
                FROM sdk_states
-               WHERE tenant_id = :tid""",
+               WHERE tenant_id = :tid"""),
             {"tid": tid},
         ).fetchone()


@@ -15,6 +15,7 @@ from typing import Optional
 import httpx
 from fastapi import APIRouter, File, Form, Header, UploadFile, HTTPException
 from pydantic import BaseModel
+from sqlalchemy import text
 from database import SessionLocal
@@ -291,11 +292,11 @@ async def analyze_document(
     db = SessionLocal()
     try:
         db.execute(
-            """INSERT INTO compliance_imported_documents
+            text("""INSERT INTO compliance_imported_documents
                (id, tenant_id, filename, file_type, file_size, detected_type, detection_confidence,
                 extracted_text, extracted_entities, recommendations, status, analyzed_at)
                VALUES (:id, :tenant_id, :filename, :file_type, :file_size, :detected_type, :confidence,
-                       :text, :entities::jsonb, :recommendations::jsonb, 'analyzed', NOW())""",
+                       :text, :entities::jsonb, :recommendations::jsonb, 'analyzed', NOW())"""),
             {
                 "id": doc_id,
                 "tenant_id": tenant_id,
@@ -313,9 +314,9 @@ async def analyze_document(
         if total_gaps > 0:
             import json
             db.execute(
-                """INSERT INTO compliance_gap_analyses
+                text("""INSERT INTO compliance_gap_analyses
                    (tenant_id, document_id, total_gaps, critical_gaps, high_gaps, medium_gaps, low_gaps, gaps, recommended_packages)
-                   VALUES (:tenant_id, :document_id, :total, :critical, :high, :medium, :low, :gaps::jsonb, :packages::jsonb)""",
+                   VALUES (:tenant_id, :document_id, :total, :critical, :high, :medium, :low, :gaps::jsonb, :packages::jsonb)"""),
                 {
                     "tenant_id": tenant_id,
                     "document_id": doc_id,
@@ -358,7 +359,7 @@ async def get_gap_analysis(
     db = SessionLocal()
     try:
         result = db.execute(
-            "SELECT * FROM compliance_gap_analyses WHERE document_id = :doc_id AND tenant_id = :tid",
+            text("SELECT * FROM compliance_gap_analyses WHERE document_id = :doc_id AND tenant_id = :tid"),
             {"doc_id": document_id, "tid": tid},
         ).fetchone()
         if not result:
@@ -374,11 +375,11 @@ async def list_documents(tenant_id: str = "default"):
     db = SessionLocal()
     try:
         result = db.execute(
-            """SELECT id, filename, file_type, file_size, detected_type, detection_confidence,
+            text("""SELECT id, filename, file_type, file_size, detected_type, detection_confidence,
                       extracted_entities, recommendations, status, analyzed_at, created_at
                FROM compliance_imported_documents
                WHERE tenant_id = :tenant_id
-               ORDER BY created_at DESC""",
+               ORDER BY created_at DESC"""),
             {"tenant_id": tenant_id},
         )
         rows = result.fetchall()
@@ -424,11 +425,11 @@ async def delete_document(
     try:
         # Delete gap analysis first (FK dependency)
         db.execute(
-            "DELETE FROM compliance_gap_analyses WHERE document_id = :doc_id AND tenant_id = :tid",
+            text("DELETE FROM compliance_gap_analyses WHERE document_id = :doc_id AND tenant_id = :tid"),
             {"doc_id": document_id, "tid": tid},
         )
         result = db.execute(
-            "DELETE FROM compliance_imported_documents WHERE id = :doc_id AND tenant_id = :tid",
+            text("DELETE FROM compliance_imported_documents WHERE id = :doc_id AND tenant_id = :tid"),
             {"doc_id": document_id, "tid": tid},
         )
         db.commit()


@@ -17,6 +17,7 @@ from typing import Optional
 import httpx
 from fastapi import APIRouter, File, Form, UploadFile, HTTPException
 from pydantic import BaseModel
+from sqlalchemy import text
 from database import SessionLocal
@@ -366,13 +367,13 @@ async def scan_dependencies(
     db = SessionLocal()
     try:
         db.execute(
-            """INSERT INTO compliance_screenings
+            text("""INSERT INTO compliance_screenings
                (id, tenant_id, status, sbom_format, sbom_version,
                 total_components, total_issues, critical_issues, high_issues, medium_issues, low_issues,
                 sbom_data, started_at, completed_at)
                VALUES (:id, :tenant_id, 'completed', 'CycloneDX', '1.5',
                        :total_components, :total_issues, :critical, :high, :medium, :low,
-                       :sbom_data::jsonb, :started_at, :completed_at)""",
+                       :sbom_data::jsonb, :started_at, :completed_at)"""),
             {
                 "id": screening_id,
                 "tenant_id": tenant_id,
@@ -391,11 +392,11 @@ async def scan_dependencies(
         # Persist security issues
         for issue in issues:
             db.execute(
-                """INSERT INTO compliance_security_issues
+                text("""INSERT INTO compliance_security_issues
                    (id, screening_id, severity, title, description, cve, cvss,
                     affected_component, affected_version, fixed_in, remediation, status)
                    VALUES (:id, :screening_id, :severity, :title, :description, :cve, :cvss,
-                           :component, :version, :fixed_in, :remediation, :status)""",
+                           :component, :version, :fixed_in, :remediation, :status)"""),
                 {
                     "id": issue["id"],
                     "screening_id": screening_id,
@@ -486,10 +487,10 @@ async def get_screening(screening_id: str):
     db = SessionLocal()
     try:
         result = db.execute(
-            """SELECT id, status, sbom_format, sbom_version,
+            text("""SELECT id, status, sbom_format, sbom_version,
                       total_components, total_issues, critical_issues, high_issues,
                       medium_issues, low_issues, sbom_data, started_at, completed_at
-               FROM compliance_screenings WHERE id = :id""",
+               FROM compliance_screenings WHERE id = :id"""),
             {"id": screening_id},
         )
         row = result.fetchone()
@@ -498,9 +499,9 @@ async def get_screening(screening_id: str):
         # Fetch issues
         issues_result = db.execute(
-            """SELECT id, severity, title, description, cve, cvss,
+            text("""SELECT id, severity, title, description, cve, cvss,
                       affected_component, affected_version, fixed_in, remediation, status
-               FROM compliance_security_issues WHERE screening_id = :id""",
+               FROM compliance_security_issues WHERE screening_id = :id"""),
             {"id": screening_id},
         )
         issues_rows = issues_result.fetchall()
@@ -566,12 +567,12 @@ async def list_screenings(tenant_id: str = "default"):
     db = SessionLocal()
     try:
         result = db.execute(
-            """SELECT id, status, total_components, total_issues,
+            text("""SELECT id, status, total_components, total_issues,
                       critical_issues, high_issues, medium_issues, low_issues,
                       started_at, completed_at, created_at
                FROM compliance_screenings
                WHERE tenant_id = :tenant_id
-               ORDER BY created_at DESC""",
+               ORDER BY created_at DESC"""),
             {"tenant_id": tenant_id},
         )
         rows = result.fetchall()


@@ -12,7 +12,7 @@ RUN npm install
 # Copy source code
 COPY . .
-# Ensure public directory exists
+# Ensure public directory exists (may not have static assets)
 RUN mkdir -p public
 # Build the application
@@ -27,8 +27,8 @@ WORKDIR /app
ENV NODE_ENV=production
# Create non-root user
RUN addgroup -S -g 1001 nodejs
RUN adduser -S -u 1001 -G nodejs nextjs
# Copy built assets
COPY --from=builder /app/public ./public
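The flag change above matters because BusyBox (the toolset on Alpine images) rejects the Debian-style long options; the new form also sets the primary group explicitly via `-G`, which the old `adduser` call omitted. A sketch of the equivalence, assuming the runtime stage is Alpine-based:

```dockerfile
# Sketch — BusyBox (Alpine) short flags vs. the Debian long flags they replace.
# Assumes the runtime stage is Alpine; BusyBox adduser/addgroup reject long options.
RUN addgroup -S -g 1001 nodejs && \
    adduser -S -u 1001 -G nodejs nextjs
# Debian/Ubuntu equivalent:
#   RUN addgroup --system --gid 1001 nodejs && \
#       adduser --system --uid 1001 --ingroup nodejs nextjs
```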

docker-compose.coolify.yml (new file, 272 lines)

@@ -0,0 +1,272 @@
# =========================================================
# BreakPilot Compliance — Compliance SDK Platform (Coolify)
# =========================================================
# Requires: breakpilot-core must be running
# Deployed via Coolify. SSL termination handled by Traefik.
# External services (managed separately in Coolify):
# - PostgreSQL, Qdrant, S3-compatible storage
# =========================================================
networks:
breakpilot-network:
external: true
name: breakpilot-network
coolify:
external: true
name: coolify
volumes:
dsms_data:
services:
# =========================================================
# FRONTEND
# =========================================================
admin-compliance:
build:
context: ./admin-compliance
dockerfile: Dockerfile
args:
NEXT_PUBLIC_API_URL: ${NEXT_PUBLIC_API_URL:-https://api-compliance.breakpilot.ai}
NEXT_PUBLIC_SDK_URL: ${NEXT_PUBLIC_SDK_URL:-https://sdk.breakpilot.ai}
container_name: bp-compliance-admin
labels:
- "traefik.docker.network=coolify"
expose:
- "3000"
environment:
NODE_ENV: production
DATABASE_URL: ${COMPLIANCE_DATABASE_URL}
BACKEND_URL: http://backend-compliance:8002
CONSENT_SERVICE_URL: http://bp-core-consent-service:8081
SDK_URL: http://ai-compliance-sdk:8090
OLLAMA_URL: ${OLLAMA_URL:-}
COMPLIANCE_LLM_MODEL: ${COMPLIANCE_LLM_MODEL:-}
depends_on:
backend-compliance:
condition: service_started
healthcheck:
test: ["CMD", "wget", "-q", "--spider", "http://127.0.0.1:3000/"]
interval: 30s
timeout: 10s
start_period: 30s
retries: 3
restart: unless-stopped
networks:
- breakpilot-network
- coolify
developer-portal:
build:
context: ./developer-portal
dockerfile: Dockerfile
container_name: bp-compliance-developer-portal
labels:
- "traefik.docker.network=coolify"
expose:
- "3000"
environment:
NODE_ENV: production
healthcheck:
test: ["CMD", "wget", "-q", "--spider", "http://127.0.0.1:3000/"]
interval: 30s
timeout: 10s
start_period: 30s
retries: 3
restart: unless-stopped
networks:
- breakpilot-network
- coolify
# =========================================================
# BACKEND
# =========================================================
backend-compliance:
build:
context: ./backend-compliance
dockerfile: Dockerfile
container_name: bp-compliance-backend
labels:
- "traefik.docker.network=coolify"
expose:
- "8002"
environment:
PORT: 8002
DATABASE_URL: ${COMPLIANCE_DATABASE_URL}
JWT_SECRET: ${JWT_SECRET}
ENVIRONMENT: production
CONSENT_SERVICE_URL: http://bp-core-consent-service:8081
VALKEY_URL: redis://bp-core-valkey:6379/0
SESSION_TTL_HOURS: ${SESSION_TTL_HOURS:-24}
COMPLIANCE_LLM_PROVIDER: ${COMPLIANCE_LLM_PROVIDER:-anthropic}
SELF_HOSTED_LLM_URL: ${SELF_HOSTED_LLM_URL:-}
SELF_HOSTED_LLM_MODEL: ${SELF_HOSTED_LLM_MODEL:-}
COMPLIANCE_LLM_MAX_TOKENS: ${COMPLIANCE_LLM_MAX_TOKENS:-4096}
COMPLIANCE_LLM_TEMPERATURE: ${COMPLIANCE_LLM_TEMPERATURE:-0.3}
COMPLIANCE_LLM_TIMEOUT: ${COMPLIANCE_LLM_TIMEOUT:-120}
ANTHROPIC_API_KEY: ${ANTHROPIC_API_KEY:-}
SMTP_HOST: ${SMTP_HOST}
SMTP_PORT: ${SMTP_PORT:-587}
SMTP_USERNAME: ${SMTP_USERNAME}
SMTP_PASSWORD: ${SMTP_PASSWORD}
SMTP_FROM_NAME: ${SMTP_FROM_NAME:-BreakPilot Compliance}
SMTP_FROM_ADDR: ${SMTP_FROM_ADDR:-compliance@breakpilot.ai}
RAG_SERVICE_URL: http://bp-core-rag-service:8097
healthcheck:
test: ["CMD", "curl", "-f", "http://127.0.0.1:8002/health"]
interval: 30s
timeout: 10s
start_period: 15s
retries: 3
restart: unless-stopped
networks:
- breakpilot-network
- coolify
# =========================================================
# SDK SERVICES
# =========================================================
ai-compliance-sdk:
build:
context: ./ai-compliance-sdk
dockerfile: Dockerfile
container_name: bp-compliance-ai-sdk
labels:
- "traefik.docker.network=coolify"
expose:
- "8090"
environment:
PORT: 8090
ENVIRONMENT: production
DATABASE_URL: ${COMPLIANCE_DATABASE_URL}
JWT_SECRET: ${JWT_SECRET}
LLM_PROVIDER: ${COMPLIANCE_LLM_PROVIDER:-anthropic}
LLM_FALLBACK_PROVIDER: ${LLM_FALLBACK_PROVIDER:-}
OLLAMA_URL: ${OLLAMA_URL:-}
OLLAMA_DEFAULT_MODEL: ${OLLAMA_DEFAULT_MODEL:-}
ANTHROPIC_API_KEY: ${ANTHROPIC_API_KEY:-}
ANTHROPIC_DEFAULT_MODEL: ${ANTHROPIC_DEFAULT_MODEL:-claude-sonnet-4-5-20250929}
PII_REDACTION_ENABLED: ${PII_REDACTION_ENABLED:-true}
PII_REDACTION_LEVEL: ${PII_REDACTION_LEVEL:-standard}
AUDIT_RETENTION_DAYS: ${AUDIT_RETENTION_DAYS:-365}
AUDIT_LOG_PROMPTS: ${AUDIT_LOG_PROMPTS:-true}
ALLOWED_ORIGINS: "*"
TTS_SERVICE_URL: http://compliance-tts-service:8095
QDRANT_URL: ${QDRANT_URL}
QDRANT_API_KEY: ${QDRANT_API_KEY:-}
healthcheck:
test: ["CMD", "wget", "-q", "--spider", "http://127.0.0.1:8090/health"]
interval: 30s
timeout: 3s
start_period: 10s
retries: 3
restart: unless-stopped
networks:
- breakpilot-network
- coolify
# =========================================================
# TTS SERVICE (Piper TTS + FFmpeg)
# =========================================================
compliance-tts-service:
build:
context: ./compliance-tts-service
dockerfile: Dockerfile
container_name: bp-compliance-tts
expose:
- "8095"
environment:
MINIO_ENDPOINT: ${S3_ENDPOINT}
MINIO_ACCESS_KEY: ${S3_ACCESS_KEY}
MINIO_SECRET_KEY: ${S3_SECRET_KEY}
MINIO_SECURE: ${S3_SECURE:-true}
PIPER_MODEL_PATH: /app/models/de_DE-thorsten-high.onnx
healthcheck:
test: ["CMD", "python", "-c", "import urllib.request; urllib.request.urlopen('http://127.0.0.1:8095/health')"]
interval: 30s
timeout: 10s
start_period: 60s
retries: 3
restart: unless-stopped
networks:
- breakpilot-network
# =========================================================
# DATA SOVEREIGNTY
# =========================================================
dsms-node:
build:
context: ./dsms-node
dockerfile: Dockerfile
container_name: bp-compliance-dsms-node
expose:
- "4001"
- "5001"
- "8080"
volumes:
- dsms_data:/data/ipfs
environment:
IPFS_PROFILE: server
healthcheck:
test: ["CMD-SHELL", "ipfs id"]
interval: 30s
timeout: 10s
start_period: 30s
retries: 3
restart: unless-stopped
networks:
- breakpilot-network
dsms-gateway:
build:
context: ./dsms-gateway
dockerfile: Dockerfile
container_name: bp-compliance-dsms-gateway
expose:
- "8082"
environment:
IPFS_API_URL: http://dsms-node:5001
IPFS_GATEWAY_URL: http://dsms-node:8080
JWT_SECRET: ${JWT_SECRET}
depends_on:
dsms-node:
condition: service_healthy
healthcheck:
test: ["CMD", "curl", "-f", "http://127.0.0.1:8082/health"]
interval: 30s
timeout: 10s
start_period: 15s
retries: 3
restart: unless-stopped
networks:
- breakpilot-network
# =========================================================
# DOCUMENT CRAWLER & AUTO-ONBOARDING
# =========================================================
document-crawler:
build:
context: ./document-crawler
dockerfile: Dockerfile
container_name: bp-compliance-document-crawler
expose:
- "8098"
environment:
PORT: 8098
DATABASE_URL: ${COMPLIANCE_DATABASE_URL}
LLM_GATEWAY_URL: http://ai-compliance-sdk:8090
DSMS_GATEWAY_URL: http://dsms-gateway:8082
CRAWL_BASE_PATH: /data/crawl
MAX_FILE_SIZE_MB: 50
volumes:
- /tmp/breakpilot-crawl-data:/data/crawl:ro
healthcheck:
test: ["CMD", "curl", "-f", "http://127.0.0.1:8098/health"]
interval: 30s
timeout: 10s
start_period: 15s
retries: 3
restart: unless-stopped
networks:
- breakpilot-network
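The compose file above reads all secrets and external endpoints from the Coolify environment. A hedged sketch of the variables it expects (every value below is a placeholder, not a real endpoint or key — the real values live in Coolify, never in git):

```bash
# Placeholder sketch of the required Coolify environment:
COMPLIANCE_DATABASE_URL=postgresql://user:pass@db-host:5432/compliance?sslmode=require
JWT_SECRET=change-me
QDRANT_URL=https://qdrant.example.com
QDRANT_API_KEY=change-me
S3_ENDPOINT=s3.example.com
S3_ACCESS_KEY=change-me
S3_SECRET_KEY=change-me
S3_SECURE=true
SMTP_HOST=smtp.example.com
SMTP_USERNAME=change-me
SMTP_PASSWORD=change-me
ANTHROPIC_API_KEY=change-me
```

Variables with `:-` defaults in the compose file (e.g. `SMTP_PORT`, `COMPLIANCE_LLM_PROVIDER`) can be omitted.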


@@ -1,16 +1,16 @@
# =========================================================
# BreakPilot Compliance — Coolify Production Override
# =========================================================
# Usage: docker compose -f docker-compose.yml -f docker-compose.hetzner.yml up -d
#
# Changes versus docker-compose.yml:
# - Platform: arm64 → amd64 (Coolify = x86_64)
# - Network: external → auto-create (no breakpilot-core on Coolify)
# - depends_on: core-health-check removed (no Core on Coolify)
# - API URLs: adjusted to Coolify-internal addresses
# =========================================================
# No breakpilot-core runs on Coolify, so create the network ourselves
networks:
  breakpilot-network:
    external: false
@@ -18,9 +18,9 @@ networks:
services:
  # Disable the core health check (Core does not run on Coolify)
  core-health-check:
    entrypoint: ["sh", "-c", "echo 'Core health check skipped on Coolify' && exit 0"]
    restart: "no"
  admin-compliance:


@@ -1,15 +1,15 @@
# CI/CD Pipeline

Overview of the deployment process for BreakPilot Compliance.

## Overview

| Component | Build Tool | Deployment |
|-----------|------------|------------|
| Frontend (Next.js) | Docker | Coolify (automatic) |
| Backend (FastAPI) | Docker | Coolify (automatic) |
| Go services | Docker (multi-stage) | Coolify (automatic) |
| Documentation | MkDocs | Docker (Nginx, local) |

## Deployment Architecture
@@ -17,386 +17,129 @@
```
┌─────────────────────────────────────────────────────────────────┐
│ Developer MacBook                                               │
│                                                                 │
│ breakpilot-compliance/                                          │
│ ├── admin-compliance/    (Next.js Dashboard)                    │
│ ├── backend-compliance/  (Python FastAPI)                       │
│ ├── ai-compliance-sdk/   (Go/Gin)                               │
│ ├── developer-portal/    (Next.js)                              │
│ └── docs-src/            (MkDocs)                               │
│                                                                 │
│ git push origin main && git push gitea main                     │
└───────────────────────────────┬─────────────────────────────────┘
                                │ git push
┌───────────────────────────────▼─────────────────────────────────┐
│ Gitea (gitea.meghsakha.com)                                     │
│                                                                 │
│ Gitea Actions CI:                                               │
│ ├── test-go-ai-compliance                                       │
│ ├── test-python-backend-compliance                              │
│ ├── test-python-document-crawler                                │
│ ├── test-python-dsms-gateway                                    │
│ └── validate-canonical-controls                                 │
│                                                                 │
│ Coolify Webhook → Build + Deploy (automatic)                    │
└───────────────────────────────┬─────────────────────────────────┘
                                │ auto-deploy
┌───────────────────────────────▼─────────────────────────────────┐
│ Production (Coolify)                                            │
│                                                                 │
│ ├── admin-dev.breakpilot.ai       (Admin Compliance)            │
│ ├── api-dev.breakpilot.ai         (Backend API)                 │
│ ├── sdk-dev.breakpilot.ai         (AI SDK)                      │
│ └── developers-dev.breakpilot.ai  (Developer Portal)            │
└─────────────────────────────────────────────────────────────────┘
```
## Workflow

### 1. Develop and commit code

```bash
# Edit code on the MacBook,
# then commit and push to both remotes:
git push origin main && git push gitea main
```
### 2. Automated tests (Gitea Actions)

A push to gitea automatically triggers the CI pipeline:

- **Go tests:** `ai-compliance-sdk` unit tests
- **Python tests:** `backend-compliance`, `document-crawler`, `dsms-gateway`
- **Validation:** canonical controls JSON validation
- **Lint:** Go, Python, Node.js (PRs only)

### 3. Automatic deployment (Coolify)

After a successful push, Coolify automatically builds and deploys all services.

**IMPORTANT:** Never click "Redeploy" manually in Coolify!
### 4. Health checks

```bash
# Check production health
curl -sf https://api-dev.breakpilot.ai/health
curl -sf https://sdk-dev.breakpilot.ai/health
```
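For checking several services at once, the two `curl` calls can be generalized. The helper below is our sketch, not code from the repo; the HTTP fetcher is injected so the logic is shown (and testable) without touching the network — in practice `fetch` would wrap e.g. `urllib.request.urlopen(...).status`:

```python
from typing import Callable, Dict, List

# Hypothetical helper (not part of the repo): report which services
# answer their /health endpoint with HTTP 200.
def check_health(base_urls: List[str], fetch: Callable[[str], int]) -> Dict[str, bool]:
    results = {}
    for base in base_urls:
        try:
            results[base] = fetch(base + "/health") == 200
        except Exception:
            results[base] = False
    return results

# Stub fetcher standing in for a real HTTP GET.
def stub_fetch(url: str) -> int:
    return 200 if "api-dev" in url else 503

status = check_health(
    ["https://api-dev.breakpilot.ai", "https://sdk-dev.breakpilot.ai"],
    stub_fetch,
)
print(status)
# {'https://api-dev.breakpilot.ai': True, 'https://sdk-dev.breakpilot.ai': False}
```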
## CI Pipeline Configuration

**File:** `.gitea/workflows/ci.yaml`

```yaml
on:
  push:
    branches: [main, develop]
  pull_request:
    branches: [main, develop]

jobs:
  test-go-ai-compliance:        # Go unit tests
  test-python-backend:          # Python unit tests
  test-python-document-crawler:
  test-python-dsms-gateway:
  validate-canonical-controls:  # JSON validation
  go-lint:                      # PRs only
  python-lint:                  # PRs only
  nodejs-lint:                  # PRs only
```
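The job list above is abbreviated. For orientation, one job might be spelled out roughly as follows — runner labels, action versions, and commands here are assumptions for illustration, not copied from the actual `ci.yaml`:

```yaml
# Sketch only — not the real ci.yaml contents.
jobs:
  test-go-ai-compliance:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-go@v5
        with:
          go-version: '1.22'
      - name: Run unit tests
        run: go test ./...
        working-directory: ai-compliance-sdk
```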
## Local Development (Mac Mini)

For local tests without Coolify:

```bash
# Pull and build on the Mac Mini
ssh macmini "git -C ~/Projekte/breakpilot-compliance pull --no-rebase origin main"
ssh macmini "/usr/local/bin/docker compose -f ~/Projekte/breakpilot-compliance/docker-compose.yml build --no-cache <service>"
ssh macmini "/usr/local/bin/docker compose -f ~/Projekte/breakpilot-compliance/docker-compose.yml up -d <service>"
```
## Troubleshooting

### Check CI status

```bash
# In the browser:
# https://gitea.meghsakha.com/Benjamin_Boenisch/breakpilot-compliance/actions
```

### Container logs (local)

```bash
ssh macmini "/usr/local/bin/docker logs -f bp-compliance-<service>"
```

### Build errors

```bash
# Clear the local build cache
ssh macmini "/usr/local/bin/docker builder prune -a"
```


@@ -64,131 +64,39 @@ Module die Compliance-Kunden im SDK sehen und nutzen:
| **Document Crawler** | Automatic crawling of legal texts | /sdk/document-crawler |
| **Advisory Board** | AI compliance advisory board | /sdk/advisory-board |
---

## URLs

### Production (Coolify-deployed)

| URL | Service | Description |
|-----|---------|-------------|
| https://admin-dev.breakpilot.ai/ | Admin Compliance | Compliance dashboard |
| https://developers-dev.breakpilot.ai/ | Developer Portal | API documentation |
| https://api-dev.breakpilot.ai/ | Backend API | Compliance REST API |
| https://sdk-dev.breakpilot.ai/ | AI SDK API | SDK backend API |

### Local (Mac Mini — dev/tests only)

| URL | Service |
|-----|---------|
| https://macmini:3007/ | Admin Compliance |
| https://macmini:3006/ | Developer Portal |
| https://macmini:8002/ | Backend API |
| https://macmini:8093/ | AI SDK API |
---

## Deployment

```bash
# Production (Coolify — the standard path):
git push origin main && git push gitea main
# Coolify builds and deploys automatically.

# Local (Mac Mini — dev/tests only):
docker compose -f breakpilot-compliance/docker-compose.yml up -d

# Rebuild a single service
docker compose -f breakpilot-compliance/docker-compose.yml build --no-cache <service>
docker compose -f breakpilot-compliance/docker-compose.yml up -d <service>
```
---
@@ -203,3 +111,17 @@ git push origin main && git push gitea main
# origin: http://macmini:3003/pilotadmin/breakpilot-compliance.git
# gitea:  git@gitea.meghsakha.com:Benjamin_Boenisch/breakpilot-compliance.git
``` ```
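Setting up the dual-remote layout from the comment above is a one-time step per clone. A sketch, run here against a throwaway repository so the commands are safe to execute anywhere:

```shell
# One-time setup of both remotes (URLs from the repo docs above),
# demonstrated in a throwaway repository.
repo=$(mktemp -d)
git -C "$repo" init -q
git -C "$repo" remote add origin http://macmini:3003/pilotadmin/breakpilot-compliance.git
git -C "$repo" remote add gitea git@gitea.meghsakha.com:Benjamin_Boenisch/breakpilot-compliance.git
git -C "$repo" remote -v
```

With both remotes configured, `git push origin main && git push gitea main` keeps the mirrors in sync.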
---

## Services Documentation

- [AI Compliance SDK](services/ai-compliance-sdk/index.md)
- [Document Crawler](services/document-crawler/index.md)
- SDK modules: see the subdirectories

## Development

- [Testing](development/testing.md)
- [Documentation](development/documentation.md)
- [CI/CD Pipeline](development/ci-cd-pipeline.md)


@@ -128,7 +128,7 @@ KI-generierte Inhalte werden via `compliance-tts-service` (Port 8095) in Audio u
- **Audio:** Piper TTS → MP3 (model: `de_DE-thorsten-high.onnx`)
- **Video:** FFmpeg → MP4 (script + voice + subtitles)
- **Storage:** S3-compatible object storage (TLS)
```
AudioPlayer → /sdk/v1/training/modules/:id/media (audio)